2026-04-07 00:00:07.801391 | Job console starting
2026-04-07 00:00:07.817933 | Updating git repos
2026-04-07 00:00:07.906196 | Cloning repos into workspace
2026-04-07 00:00:08.163616 | Restoring repo states
2026-04-07 00:00:08.188670 | Merging changes
2026-04-07 00:00:08.188691 | Checking out repos
2026-04-07 00:00:08.734741 | Preparing playbooks
2026-04-07 00:00:09.786967 | Running Ansible setup
2026-04-07 00:00:17.874239 | PRE-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/pre.yaml@main]
2026-04-07 00:00:19.633233 |
2026-04-07 00:00:19.633360 | PLAY [Base pre]
2026-04-07 00:00:19.666158 |
2026-04-07 00:00:19.670933 | TASK [Setup log path fact]
2026-04-07 00:00:19.707782 | orchestrator | ok
2026-04-07 00:00:19.764011 |
2026-04-07 00:00:19.764132 | TASK [set-zuul-log-path-fact : Set log path for a build]
2026-04-07 00:00:19.831388 | orchestrator | ok
2026-04-07 00:00:19.866160 |
2026-04-07 00:00:19.866371 | TASK [emit-job-header : Print job information]
2026-04-07 00:00:19.918934 | # Job Information
2026-04-07 00:00:19.919091 | Ansible Version: 2.16.14
2026-04-07 00:00:19.919125 | Job: testbed-deploy-stable-in-a-nutshell-with-tempest-ubuntu-24.04
2026-04-07 00:00:19.919159 | Pipeline: periodic-midnight
2026-04-07 00:00:19.919182 | Executor: 521e9411259a
2026-04-07 00:00:19.919202 | Triggered by: https://github.com/osism/testbed
2026-04-07 00:00:19.919224 | Event ID: 319058190fd34c37a7841e4813e72f7e
2026-04-07 00:00:19.934351 |
2026-04-07 00:00:19.934491 | LOOP [emit-job-header : Print node information]
2026-04-07 00:00:20.204663 | orchestrator | ok:
2026-04-07 00:00:20.204869 | orchestrator | # Node Information
2026-04-07 00:00:20.204903 | orchestrator | Inventory Hostname: orchestrator
2026-04-07 00:00:20.204924 | orchestrator | Hostname: zuul-static-regiocloud-infra-1
2026-04-07 00:00:20.204942 | orchestrator | Username: zuul-testbed02
2026-04-07 00:00:20.204959 | orchestrator | Distro: Debian 12.13
2026-04-07 00:00:20.204978 | orchestrator | Provider: static-testbed
2026-04-07 00:00:20.204996 | orchestrator | Region:
2026-04-07 00:00:20.205013 | orchestrator | Label: testbed-orchestrator
2026-04-07 00:00:20.205029 | orchestrator | Product Name: OpenStack Nova
2026-04-07 00:00:20.205045 | orchestrator | Interface IP: 81.163.193.140
2026-04-07 00:00:20.243033 |
2026-04-07 00:00:20.243151 | TASK [log-inventory : Ensure Zuul Ansible directory exists]
2026-04-07 00:00:21.339704 | orchestrator -> localhost | changed
2026-04-07 00:00:21.346598 |
2026-04-07 00:00:21.346689 | TASK [log-inventory : Copy ansible inventory to logs dir]
2026-04-07 00:00:23.739981 | orchestrator -> localhost | changed
2026-04-07 00:00:23.767888 |
2026-04-07 00:00:23.768041 | TASK [add-build-sshkey : Check to see if ssh key was already created for this build]
2026-04-07 00:00:24.680383 | orchestrator -> localhost | ok
2026-04-07 00:00:24.686113 |
2026-04-07 00:00:24.686202 | TASK [add-build-sshkey : Create a new key in workspace based on build UUID]
2026-04-07 00:00:24.713671 | orchestrator | ok
2026-04-07 00:00:24.776034 | orchestrator | included: /var/lib/zuul/builds/aef34ac854674087aa01508de92070da/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/create-key-and-replace.yaml
2026-04-07 00:00:24.806569 |
2026-04-07 00:00:24.806667 | TASK [add-build-sshkey : Create Temp SSH key]
2026-04-07 00:00:28.553495 | orchestrator -> localhost | Generating public/private rsa key pair.
2026-04-07 00:00:28.553659 | orchestrator -> localhost | Your identification has been saved in /var/lib/zuul/builds/aef34ac854674087aa01508de92070da/work/aef34ac854674087aa01508de92070da_id_rsa
2026-04-07 00:00:28.553692 | orchestrator -> localhost | Your public key has been saved in /var/lib/zuul/builds/aef34ac854674087aa01508de92070da/work/aef34ac854674087aa01508de92070da_id_rsa.pub
2026-04-07 00:00:28.553715 | orchestrator -> localhost | The key fingerprint is:
2026-04-07 00:00:28.553735 | orchestrator -> localhost | SHA256:UIMmunZjBMGiPpBDJ+4XE/D/W8X9+B7t1pQ58OHOfyw zuul-build-sshkey
2026-04-07 00:00:28.553755 | orchestrator -> localhost | The key's randomart image is:
2026-04-07 00:00:28.553783 | orchestrator -> localhost | +---[RSA 3072]----+
2026-04-07 00:00:28.553802 | orchestrator -> localhost | | oo. .o |
2026-04-07 00:00:28.553821 | orchestrator -> localhost | |.o+o. o. . |
2026-04-07 00:00:28.553838 | orchestrator -> localhost | |+oo+.o. |
2026-04-07 00:00:28.553856 | orchestrator -> localhost | |=..oo . . .. . |
2026-04-07 00:00:28.553873 | orchestrator -> localhost | |+. oo. S o .+ +|
2026-04-07 00:00:28.553897 | orchestrator -> localhost | | +o.+ . . o*o|
2026-04-07 00:00:28.553915 | orchestrator -> localhost | | .oo . . . .o=+|
2026-04-07 00:00:28.553931 | orchestrator -> localhost | | o E+*|
2026-04-07 00:00:28.553949 | orchestrator -> localhost | | . .==|
2026-04-07 00:00:28.553965 | orchestrator -> localhost | +----[SHA256]-----+
2026-04-07 00:00:28.554008 | orchestrator -> localhost | ok: Runtime: 0:00:02.741014
2026-04-07 00:00:28.559923 |
2026-04-07 00:00:28.559998 | TASK [add-build-sshkey : Remote setup ssh keys (linux)]
2026-04-07 00:00:28.620501 | orchestrator | ok
2026-04-07 00:00:28.639552 | orchestrator | included: /var/lib/zuul/builds/aef34ac854674087aa01508de92070da/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/remote-linux.yaml
2026-04-07 00:00:28.689186 |
2026-04-07 00:00:28.689285 | TASK [add-build-sshkey : Remove previously added zuul-build-sshkey]
2026-04-07 00:00:28.753389 | orchestrator | skipping: Conditional result was False
2026-04-07 00:00:28.762147 |
2026-04-07 00:00:28.762238 | TASK [add-build-sshkey : Enable access via build key on all nodes]
2026-04-07 00:00:29.620399 | orchestrator | changed
2026-04-07 00:00:29.640541 |
2026-04-07 00:00:29.640638 | TASK [add-build-sshkey : Make sure user has a .ssh]
2026-04-07 00:00:29.946990 | orchestrator | ok
2026-04-07 00:00:29.958048 |
2026-04-07 00:00:29.958158 | TASK [add-build-sshkey : Install build private key as SSH key on all nodes]
2026-04-07 00:00:30.530608 | orchestrator | ok
2026-04-07 00:00:30.548325 |
2026-04-07 00:00:30.548444 | TASK [add-build-sshkey : Install build public key as SSH key on all nodes]
2026-04-07 00:00:31.051588 | orchestrator | ok
2026-04-07 00:00:31.061078 |
2026-04-07 00:00:31.061180 | TASK [add-build-sshkey : Remote setup ssh keys (windows)]
2026-04-07 00:00:31.141348 | orchestrator | skipping: Conditional result was False
2026-04-07 00:00:31.148594 |
2026-04-07 00:00:31.148722 | TASK [remove-zuul-sshkey : Remove master key from local agent]
2026-04-07 00:00:32.489583 | orchestrator -> localhost | changed
2026-04-07 00:00:32.502571 |
2026-04-07 00:00:32.502675 | TASK [add-build-sshkey : Add back temp key]
2026-04-07 00:00:33.687194 | orchestrator -> localhost | Identity added: /var/lib/zuul/builds/aef34ac854674087aa01508de92070da/work/aef34ac854674087aa01508de92070da_id_rsa (zuul-build-sshkey)
2026-04-07 00:00:33.687420 | orchestrator -> localhost | ok: Runtime: 0:00:00.044083
2026-04-07 00:00:33.697983 |
2026-04-07 00:00:33.698081 | TASK [add-build-sshkey : Verify we can still SSH to all nodes]
2026-04-07 00:00:34.557583 | orchestrator | ok
2026-04-07 00:00:34.579197 |
2026-04-07 00:00:34.579297 | TASK [add-build-sshkey : Verify we can still SSH to all nodes (windows)]
2026-04-07 00:00:34.635198 | orchestrator | skipping: Conditional result was False
2026-04-07 00:00:34.737526 |
2026-04-07 00:00:34.737618 | TASK [start-zuul-console : Start zuul_console daemon.]
2026-04-07 00:00:35.478130 | orchestrator | ok
2026-04-07 00:00:35.501387 |
2026-04-07 00:00:35.501507 | TASK [validate-host : Define zuul_info_dir fact]
2026-04-07 00:00:35.539619 | orchestrator | ok
2026-04-07 00:00:35.549455 |
2026-04-07 00:00:35.549600 | TASK [validate-host : Ensure Zuul Ansible directory exists]
2026-04-07 00:00:36.474509 | orchestrator -> localhost | ok
2026-04-07 00:00:36.480375 |
2026-04-07 00:00:36.480467 | TASK [validate-host : Collect information about the host]
2026-04-07 00:00:37.867377 | orchestrator | ok
2026-04-07 00:00:37.910102 |
2026-04-07 00:00:37.910227 | TASK [validate-host : Sanitize hostname]
2026-04-07 00:00:38.147769 | orchestrator | ok
2026-04-07 00:00:38.152609 |
2026-04-07 00:00:38.152696 | TASK [validate-host : Write out all ansible variables/facts known for each host]
2026-04-07 00:00:39.411759 | orchestrator -> localhost | changed
2026-04-07 00:00:39.416913 |
2026-04-07 00:00:39.416998 | TASK [validate-host : Collect information about zuul worker]
2026-04-07 00:00:40.153603 | orchestrator | ok
2026-04-07 00:00:40.157933 |
2026-04-07 00:00:40.158019 | TASK [validate-host : Write out all zuul information for each host]
2026-04-07 00:00:41.553708 | orchestrator -> localhost | changed
2026-04-07 00:00:41.562166 |
2026-04-07 00:00:41.562257 | TASK [prepare-workspace-log : Start zuul_console daemon.]
2026-04-07 00:00:41.899964 | orchestrator | ok
2026-04-07 00:00:41.904728 |
2026-04-07 00:00:41.904807 | TASK [prepare-workspace-log : Synchronize src repos to workspace directory.]
2026-04-07 00:02:11.007276 | orchestrator | changed:
2026-04-07 00:02:11.007546 | orchestrator | .d..t...... src/
2026-04-07 00:02:11.007584 | orchestrator | .d..t...... src/github.com/
2026-04-07 00:02:11.007608 | orchestrator | .d..t...... src/github.com/osism/
2026-04-07 00:02:11.007630 | orchestrator | .d..t...... src/github.com/osism/ansible-collection-commons/
2026-04-07 00:02:11.007651 | orchestrator | RedHat.yml
2026-04-07 00:02:11.024183 | orchestrator | .L..t...... src/github.com/osism/ansible-collection-commons/roles/repository/tasks/CentOS.yml -> RedHat.yml
2026-04-07 00:02:11.024200 | orchestrator | RedHat.yml
2026-04-07 00:02:11.024254 | orchestrator | = 2.2.0"...
2026-04-07 00:02:22.584901 | orchestrator | - Finding latest version of hashicorp/null...
2026-04-07 00:02:22.600893 | orchestrator | - Finding terraform-provider-openstack/openstack versions matching ">= 1.53.0"...
2026-04-07 00:02:22.751736 | orchestrator | - Installing hashicorp/local v2.8.0...
2026-04-07 00:02:23.306196 | orchestrator | - Installed hashicorp/local v2.8.0 (signed, key ID 0C0AF313E5FD9F80)
2026-04-07 00:02:23.375531 | orchestrator | - Installing hashicorp/null v3.2.4...
2026-04-07 00:02:23.859920 | orchestrator | - Installed hashicorp/null v3.2.4 (signed, key ID 0C0AF313E5FD9F80)
2026-04-07 00:02:23.925162 | orchestrator | - Installing terraform-provider-openstack/openstack v3.4.0...
2026-04-07 00:02:24.803857 | orchestrator | - Installed terraform-provider-openstack/openstack v3.4.0 (signed, key ID 4F80527A391BEFD2)
2026-04-07 00:02:24.803913 | orchestrator |
2026-04-07 00:02:24.803920 | orchestrator | Providers are signed by their developers.
2026-04-07 00:02:24.803926 | orchestrator | If you'd like to know more about provider signing, you can read about it here:
2026-04-07 00:02:24.803932 | orchestrator | https://opentofu.org/docs/cli/plugins/signing/
2026-04-07 00:02:24.803947 | orchestrator |
2026-04-07 00:02:24.803952 | orchestrator | OpenTofu has created a lock file .terraform.lock.hcl to record the provider
2026-04-07 00:02:24.803963 | orchestrator | selections it made above. Include this file in your version control repository
2026-04-07 00:02:24.803968 | orchestrator | so that OpenTofu can guarantee to make the same selections by default when
2026-04-07 00:02:24.803973 | orchestrator | you run "tofu init" in the future.
2026-04-07 00:02:24.804982 | orchestrator |
2026-04-07 00:02:24.805355 | orchestrator | OpenTofu has been successfully initialized!
2026-04-07 00:02:24.805394 | orchestrator |
2026-04-07 00:02:24.805414 | orchestrator | You may now begin working with OpenTofu. Try running "tofu plan" to see
2026-04-07 00:02:24.805433 | orchestrator | any changes that are required for your infrastructure. All OpenTofu commands
2026-04-07 00:02:24.805452 | orchestrator | should now work.
2026-04-07 00:02:24.805471 | orchestrator |
2026-04-07 00:02:24.805491 | orchestrator | If you ever set or change modules or backend configuration for OpenTofu,
2026-04-07 00:02:24.805510 | orchestrator | rerun this command to reinitialize your working directory. If you forget, other
2026-04-07 00:02:24.805529 | orchestrator | commands will detect it and remind you to do so if necessary.
2026-04-07 00:02:25.362000 | orchestrator | Created and switched to workspace "ci"!
2026-04-07 00:02:25.362279 | orchestrator |
2026-04-07 00:02:25.362295 | orchestrator | You're now on a new, empty workspace. Workspaces isolate their state,
2026-04-07 00:02:25.362301 | orchestrator | so if you run "tofu plan" OpenTofu will not see any existing state
2026-04-07 00:02:25.362307 | orchestrator | for this configuration.
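The provider versions negotiated in the `tofu init` output above imply a `required_providers` block along these lines. This is a hedged reconstruction from the log, not the actual testbed source: the first "Finding ... versions matching" line is truncated to `= 2.2.0"...`, so attributing that constraint to `hashicorp/local` is an assumption, and `hashicorp/null` appears unpinned because the log says "Finding latest version".

```hcl
# Sketch reconstructed from the init log above; not the actual testbed source.
terraform {
  required_providers {
    # The ">= 2.2.0" constraint is truncated in the log; assigning it to
    # hashicorp/local is an assumption.
    local = {
      source  = "hashicorp/local"
      version = ">= 2.2.0"
    }
    # "Finding latest version of hashicorp/null" suggests no version constraint.
    null = {
      source = "hashicorp/null"
    }
    # Constraint shown verbatim in the log.
    openstack = {
      source  = "terraform-provider-openstack/openstack"
      version = ">= 1.53.0"
    }
  }
}
```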
2026-04-07 00:02:25.883690 | orchestrator | ci.auto.tfvars
2026-04-07 00:02:26.477754 | orchestrator | default_custom.tf
2026-04-07 00:02:27.485442 | orchestrator | data.openstack_networking_network_v2.public: Reading...
2026-04-07 00:02:28.095455 | orchestrator | data.openstack_networking_network_v2.public: Read complete after 1s [id=e6be7364-bfd8-4de7-8120-8f41c69a139a]
2026-04-07 00:02:28.495779 | orchestrator |
2026-04-07 00:02:28.495845 | orchestrator | OpenTofu used the selected providers to generate the following execution
2026-04-07 00:02:28.495852 | orchestrator | plan. Resource actions are indicated with the following symbols:
2026-04-07 00:02:28.495864 | orchestrator |   + create
2026-04-07 00:02:28.495869 | orchestrator |  <= read (data resources)
2026-04-07 00:02:28.495874 | orchestrator |
2026-04-07 00:02:28.495879 | orchestrator | OpenTofu will perform the following actions:
2026-04-07 00:02:28.495886 | orchestrator |
2026-04-07 00:02:28.495891 | orchestrator |   # data.openstack_images_image_v2.image will be read during apply
2026-04-07 00:02:28.495895 | orchestrator |   # (config refers to values not yet known)
2026-04-07 00:02:28.495900 | orchestrator |  <= data "openstack_images_image_v2" "image" {
2026-04-07 00:02:28.495904 | orchestrator |       + checksum = (known after apply)
2026-04-07 00:02:28.495908 | orchestrator |       + created_at = (known after apply)
2026-04-07 00:02:28.495912 | orchestrator |       + file = (known after apply)
2026-04-07 00:02:28.495916 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.495940 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.495945 | orchestrator |       + min_disk_gb = (known after apply)
2026-04-07 00:02:28.495949 | orchestrator |       + min_ram_mb = (known after apply)
2026-04-07 00:02:28.495953 | orchestrator |       + most_recent = true
2026-04-07 00:02:28.495957 | orchestrator |       + name = (known after apply)
2026-04-07 00:02:28.495962 | orchestrator |       + protected = (known after apply)
2026-04-07 00:02:28.495966 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.495973 | orchestrator |       + schema = (known after apply)
2026-04-07 00:02:28.495977 | orchestrator |       + size_bytes = (known after apply)
2026-04-07 00:02:28.495982 | orchestrator |       + tags = (known after apply)
2026-04-07 00:02:28.495986 | orchestrator |       + updated_at = (known after apply)
2026-04-07 00:02:28.496002 | orchestrator |     }
2026-04-07 00:02:28.496009 | orchestrator |
2026-04-07 00:02:28.496013 | orchestrator |   # data.openstack_images_image_v2.image_node will be read during apply
2026-04-07 00:02:28.496017 | orchestrator |   # (config refers to values not yet known)
2026-04-07 00:02:28.496021 | orchestrator |  <= data "openstack_images_image_v2" "image_node" {
2026-04-07 00:02:28.496026 | orchestrator |       + checksum = (known after apply)
2026-04-07 00:02:28.496030 | orchestrator |       + created_at = (known after apply)
2026-04-07 00:02:28.496034 | orchestrator |       + file = (known after apply)
2026-04-07 00:02:28.496038 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.496042 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.496046 | orchestrator |       + min_disk_gb = (known after apply)
2026-04-07 00:02:28.496050 | orchestrator |       + min_ram_mb = (known after apply)
2026-04-07 00:02:28.496055 | orchestrator |       + most_recent = true
2026-04-07 00:02:28.496059 | orchestrator |       + name = (known after apply)
2026-04-07 00:02:28.496062 | orchestrator |       + protected = (known after apply)
2026-04-07 00:02:28.496066 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.496070 | orchestrator |       + schema = (known after apply)
2026-04-07 00:02:28.496074 | orchestrator |       + size_bytes = (known after apply)
2026-04-07 00:02:28.496077 | orchestrator |       + tags = (known after apply)
2026-04-07 00:02:28.496081 | orchestrator |       + updated_at = (known after apply)
2026-04-07 00:02:28.496085 | orchestrator |     }
2026-04-07 00:02:28.496136 | orchestrator |
2026-04-07 00:02:28.496143 | orchestrator |   # local_file.MANAGER_ADDRESS will be created
2026-04-07 00:02:28.496148 | orchestrator |   + resource "local_file" "MANAGER_ADDRESS" {
2026-04-07 00:02:28.496152 | orchestrator |       + content = (known after apply)
2026-04-07 00:02:28.496157 | orchestrator |       + content_base64sha256 = (known after apply)
2026-04-07 00:02:28.496161 | orchestrator |       + content_base64sha512 = (known after apply)
2026-04-07 00:02:28.496166 | orchestrator |       + content_md5 = (known after apply)
2026-04-07 00:02:28.496170 | orchestrator |       + content_sha1 = (known after apply)
2026-04-07 00:02:28.496174 | orchestrator |       + content_sha256 = (known after apply)
2026-04-07 00:02:28.496179 | orchestrator |       + content_sha512 = (known after apply)
2026-04-07 00:02:28.496183 | orchestrator |       + directory_permission = "0777"
2026-04-07 00:02:28.496187 | orchestrator |       + file_permission = "0644"
2026-04-07 00:02:28.496192 | orchestrator |       + filename = ".MANAGER_ADDRESS.ci"
2026-04-07 00:02:28.496196 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.496200 | orchestrator |     }
2026-04-07 00:02:28.496234 | orchestrator |
2026-04-07 00:02:28.496240 | orchestrator |   # local_file.id_rsa_pub will be created
2026-04-07 00:02:28.496244 | orchestrator |   + resource "local_file" "id_rsa_pub" {
2026-04-07 00:02:28.496248 | orchestrator |       + content = (known after apply)
2026-04-07 00:02:28.496261 | orchestrator |       + content_base64sha256 = (known after apply)
2026-04-07 00:02:28.496265 | orchestrator |       + content_base64sha512 = (known after apply)
2026-04-07 00:02:28.496268 | orchestrator |       + content_md5 = (known after apply)
2026-04-07 00:02:28.496272 | orchestrator |       + content_sha1 = (known after apply)
2026-04-07 00:02:28.496276 | orchestrator |       + content_sha256 = (known after apply)
2026-04-07 00:02:28.496284 | orchestrator |       + content_sha512 = (known after apply)
2026-04-07 00:02:28.496288 | orchestrator |       + directory_permission = "0777"
2026-04-07 00:02:28.496292 | orchestrator |       + file_permission = "0644"
2026-04-07 00:02:28.496301 | orchestrator |       + filename = ".id_rsa.ci.pub"
2026-04-07 00:02:28.496305 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.496308 | orchestrator |     }
2026-04-07 00:02:28.496421 | orchestrator |
2026-04-07 00:02:28.496432 | orchestrator |   # local_file.inventory will be created
2026-04-07 00:02:28.496438 | orchestrator |   + resource "local_file" "inventory" {
2026-04-07 00:02:28.496443 | orchestrator |       + content = (known after apply)
2026-04-07 00:02:28.496449 | orchestrator |       + content_base64sha256 = (known after apply)
2026-04-07 00:02:28.496455 | orchestrator |       + content_base64sha512 = (known after apply)
2026-04-07 00:02:28.496460 | orchestrator |       + content_md5 = (known after apply)
2026-04-07 00:02:28.496465 | orchestrator |       + content_sha1 = (known after apply)
2026-04-07 00:02:28.496471 | orchestrator |       + content_sha256 = (known after apply)
2026-04-07 00:02:28.496486 | orchestrator |       + content_sha512 = (known after apply)
2026-04-07 00:02:28.496493 | orchestrator |       + directory_permission = "0777"
2026-04-07 00:02:28.496499 | orchestrator |       + file_permission = "0644"
2026-04-07 00:02:28.496505 | orchestrator |       + filename = "inventory.ci"
2026-04-07 00:02:28.496510 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.496516 | orchestrator |     }
2026-04-07 00:02:28.496525 | orchestrator |
2026-04-07 00:02:28.496532 | orchestrator |   # local_sensitive_file.id_rsa will be created
2026-04-07 00:02:28.496538 | orchestrator |   + resource "local_sensitive_file" "id_rsa" {
2026-04-07 00:02:28.496544 | orchestrator |       + content = (sensitive value)
2026-04-07 00:02:28.496551 | orchestrator |       + content_base64sha256 = (known after apply)
2026-04-07 00:02:28.496557 | orchestrator |       + content_base64sha512 = (known after apply)
2026-04-07 00:02:28.496565 | orchestrator |       + content_md5 = (known after apply)
2026-04-07 00:02:28.496569 | orchestrator |       + content_sha1 = (known after apply)
2026-04-07 00:02:28.496573 | orchestrator |       + content_sha256 = (known after apply)
2026-04-07 00:02:28.496576 | orchestrator |       + content_sha512 = (known after apply)
2026-04-07 00:02:28.496580 | orchestrator |       + directory_permission = "0700"
2026-04-07 00:02:28.496584 | orchestrator |       + file_permission = "0600"
2026-04-07 00:02:28.496588 | orchestrator |       + filename = ".id_rsa.ci"
2026-04-07 00:02:28.496592 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.496595 | orchestrator |     }
2026-04-07 00:02:28.496601 | orchestrator |
2026-04-07 00:02:28.496605 | orchestrator |   # null_resource.node_semaphore will be created
2026-04-07 00:02:28.496608 | orchestrator |   + resource "null_resource" "node_semaphore" {
2026-04-07 00:02:28.496612 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.496616 | orchestrator |     }
2026-04-07 00:02:28.496621 | orchestrator |
2026-04-07 00:02:28.496625 | orchestrator |   # openstack_blockstorage_volume_v3.manager_base_volume[0] will be created
2026-04-07 00:02:28.496629 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "manager_base_volume" {
2026-04-07 00:02:28.496633 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.496637 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.496641 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.496644 | orchestrator |       + image_id = (known after apply)
2026-04-07 00:02:28.496648 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.496652 | orchestrator |       + name = "testbed-volume-manager-base"
2026-04-07 00:02:28.496655 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.496659 | orchestrator |       + size = 80
2026-04-07 00:02:28.496663 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.496667 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.496671 | orchestrator |     }
2026-04-07 00:02:28.506118 | orchestrator |
2026-04-07 00:02:28.506175 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[0] will be created
2026-04-07 00:02:28.506181 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-07 00:02:28.506186 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506190 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506195 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506213 | orchestrator |       + image_id = (known after apply)
2026-04-07 00:02:28.506218 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506221 | orchestrator |       + name = "testbed-volume-0-node-base"
2026-04-07 00:02:28.506225 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506229 | orchestrator |       + size = 80
2026-04-07 00:02:28.506233 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506237 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506240 | orchestrator |     }
2026-04-07 00:02:28.506244 | orchestrator |
2026-04-07 00:02:28.506248 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[1] will be created
2026-04-07 00:02:28.506252 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-07 00:02:28.506257 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506261 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506265 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506269 | orchestrator |       + image_id = (known after apply)
2026-04-07 00:02:28.506272 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506276 | orchestrator |       + name = "testbed-volume-1-node-base"
2026-04-07 00:02:28.506280 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506283 | orchestrator |       + size = 80
2026-04-07 00:02:28.506287 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506291 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506295 | orchestrator |     }
2026-04-07 00:02:28.506299 | orchestrator |
2026-04-07 00:02:28.506302 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[2] will be created
2026-04-07 00:02:28.506306 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-07 00:02:28.506310 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506314 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506317 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506321 | orchestrator |       + image_id = (known after apply)
2026-04-07 00:02:28.506325 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506329 | orchestrator |       + name = "testbed-volume-2-node-base"
2026-04-07 00:02:28.506333 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506336 | orchestrator |       + size = 80
2026-04-07 00:02:28.506345 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506349 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506352 | orchestrator |     }
2026-04-07 00:02:28.506356 | orchestrator |
2026-04-07 00:02:28.506360 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[3] will be created
2026-04-07 00:02:28.506364 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-07 00:02:28.506367 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506371 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506375 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506378 | orchestrator |       + image_id = (known after apply)
2026-04-07 00:02:28.506382 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506386 | orchestrator |       + name = "testbed-volume-3-node-base"
2026-04-07 00:02:28.506389 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506393 | orchestrator |       + size = 80
2026-04-07 00:02:28.506397 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506401 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506404 | orchestrator |     }
2026-04-07 00:02:28.506408 | orchestrator |
2026-04-07 00:02:28.506412 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[4] will be created
2026-04-07 00:02:28.506415 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-07 00:02:28.506420 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506423 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506427 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506434 | orchestrator |       + image_id = (known after apply)
2026-04-07 00:02:28.506438 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506442 | orchestrator |       + name = "testbed-volume-4-node-base"
2026-04-07 00:02:28.506445 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506449 | orchestrator |       + size = 80
2026-04-07 00:02:28.506453 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506456 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506460 | orchestrator |     }
2026-04-07 00:02:28.506464 | orchestrator |
2026-04-07 00:02:28.506468 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[5] will be created
2026-04-07 00:02:28.506471 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-07 00:02:28.506475 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506479 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506482 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506486 | orchestrator |       + image_id = (known after apply)
2026-04-07 00:02:28.506490 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506493 | orchestrator |       + name = "testbed-volume-5-node-base"
2026-04-07 00:02:28.506497 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506501 | orchestrator |       + size = 80
2026-04-07 00:02:28.506504 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506508 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506512 | orchestrator |     }
2026-04-07 00:02:28.506515 | orchestrator |
2026-04-07 00:02:28.506519 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[0] will be created
2026-04-07 00:02:28.506526 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-07 00:02:28.506530 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506533 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506537 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506541 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506545 | orchestrator |       + name = "testbed-volume-0-node-3"
2026-04-07 00:02:28.506558 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506562 | orchestrator |       + size = 20
2026-04-07 00:02:28.506566 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506569 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506573 | orchestrator |     }
2026-04-07 00:02:28.506577 | orchestrator |
2026-04-07 00:02:28.506581 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[1] will be created
2026-04-07 00:02:28.506584 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-07 00:02:28.506588 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506592 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506595 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506599 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506603 | orchestrator |       + name = "testbed-volume-1-node-4"
2026-04-07 00:02:28.506606 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506610 | orchestrator |       + size = 20
2026-04-07 00:02:28.506614 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506618 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506621 | orchestrator |     }
2026-04-07 00:02:28.506625 | orchestrator |
2026-04-07 00:02:28.506629 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[2] will be created
2026-04-07 00:02:28.506632 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-07 00:02:28.506636 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506640 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506643 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506647 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506651 | orchestrator |       + name = "testbed-volume-2-node-5"
2026-04-07 00:02:28.506654 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506661 | orchestrator |       + size = 20
2026-04-07 00:02:28.506665 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506668 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506672 | orchestrator |     }
2026-04-07 00:02:28.506676 | orchestrator |
2026-04-07 00:02:28.506679 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[3] will be created
2026-04-07 00:02:28.506683 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-07 00:02:28.506687 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506691 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506694 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506704 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506708 | orchestrator |       + name = "testbed-volume-3-node-3"
2026-04-07 00:02:28.506712 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506716 | orchestrator |       + size = 20
2026-04-07 00:02:28.506719 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506723 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506727 | orchestrator |     }
2026-04-07 00:02:28.506731 | orchestrator |
2026-04-07 00:02:28.506734 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[4] will be created
2026-04-07 00:02:28.506738 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-07 00:02:28.506742 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506745 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506749 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506753 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506757 | orchestrator |       + name = "testbed-volume-4-node-4"
2026-04-07 00:02:28.506760 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506764 | orchestrator |       + size = 20
2026-04-07 00:02:28.506768 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506772 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506775 | orchestrator |     }
2026-04-07 00:02:28.506779 | orchestrator |
2026-04-07 00:02:28.506783 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[5] will be created
2026-04-07 00:02:28.506786 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-07 00:02:28.506790 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506794 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.506798 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.506801 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.506805 | orchestrator |       + name = "testbed-volume-5-node-5"
2026-04-07 00:02:28.506809 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.506812 | orchestrator |       + size = 20
2026-04-07 00:02:28.506816 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.506820 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.506823 | orchestrator |     }
2026-04-07 00:02:28.506827 | orchestrator |
2026-04-07 00:02:28.506831 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[6] will be created
2026-04-07 00:02:28.506835 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-07 00:02:28.506838 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.506842 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.508044 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.508052 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.508057 | orchestrator |       + name = "testbed-volume-6-node-3"
2026-04-07 00:02:28.508061 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.508066 | orchestrator |       + size = 20
2026-04-07 00:02:28.508070 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.508075 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.508080 | orchestrator |     }
2026-04-07 00:02:28.508084 | orchestrator |
2026-04-07 00:02:28.508088 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[7] will be created
2026-04-07 00:02:28.508092 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-07 00:02:28.508100 | orchestrator |       + attachment = (known after apply)
2026-04-07 00:02:28.508104 | orchestrator |       + availability_zone = "nova"
2026-04-07 00:02:28.508108 | orchestrator |       + id = (known after apply)
2026-04-07 00:02:28.508112 | orchestrator |       + metadata = (known after apply)
2026-04-07 00:02:28.508116 | orchestrator |       + name = "testbed-volume-7-node-4"
2026-04-07 00:02:28.508119 | orchestrator |       + region = (known after apply)
2026-04-07 00:02:28.508123 | orchestrator |       + size = 20
2026-04-07 00:02:28.508127 | orchestrator |       + volume_retype_policy = "never"
2026-04-07 00:02:28.508130 | orchestrator |       + volume_type = "ssd"
2026-04-07 00:02:28.508134 | orchestrator |     }
2026-04-07 00:02:28.508138 | orchestrator |
2026-04-07 00:02:28.508142 | orchestrator | #
openstack_blockstorage_volume_v3.node_volume[8] will be created 2026-04-07 00:02:28.508145 | orchestrator | + resource "openstack_blockstorage_volume_v3" "node_volume" { 2026-04-07 00:02:28.508155 | orchestrator | + attachment = (known after apply) 2026-04-07 00:02:28.508159 | orchestrator | + availability_zone = "nova" 2026-04-07 00:02:28.508163 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.508167 | orchestrator | + metadata = (known after apply) 2026-04-07 00:02:28.508171 | orchestrator | + name = "testbed-volume-8-node-5" 2026-04-07 00:02:28.508174 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.508178 | orchestrator | + size = 20 2026-04-07 00:02:28.508182 | orchestrator | + volume_retype_policy = "never" 2026-04-07 00:02:28.508185 | orchestrator | + volume_type = "ssd" 2026-04-07 00:02:28.508189 | orchestrator | } 2026-04-07 00:02:28.508193 | orchestrator | 2026-04-07 00:02:28.508197 | orchestrator | # openstack_compute_instance_v2.manager_server will be created 2026-04-07 00:02:28.508200 | orchestrator | + resource "openstack_compute_instance_v2" "manager_server" { 2026-04-07 00:02:28.508204 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-07 00:02:28.508208 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-07 00:02:28.508211 | orchestrator | + all_metadata = (known after apply) 2026-04-07 00:02:28.508215 | orchestrator | + all_tags = (known after apply) 2026-04-07 00:02:28.508219 | orchestrator | + availability_zone = "nova" 2026-04-07 00:02:28.508222 | orchestrator | + config_drive = true 2026-04-07 00:02:28.508229 | orchestrator | + created = (known after apply) 2026-04-07 00:02:28.508233 | orchestrator | + flavor_id = (known after apply) 2026-04-07 00:02:28.508237 | orchestrator | + flavor_name = "OSISM-4V-16" 2026-04-07 00:02:28.508240 | orchestrator | + force_delete = false 2026-04-07 00:02:28.508244 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-07 00:02:28.508248 | 
orchestrator | + id = (known after apply) 2026-04-07 00:02:28.508251 | orchestrator | + image_id = (known after apply) 2026-04-07 00:02:28.508255 | orchestrator | + image_name = (known after apply) 2026-04-07 00:02:28.508259 | orchestrator | + key_pair = "testbed" 2026-04-07 00:02:28.508263 | orchestrator | + name = "testbed-manager" 2026-04-07 00:02:28.508267 | orchestrator | + power_state = "active" 2026-04-07 00:02:28.508270 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.508274 | orchestrator | + security_groups = (known after apply) 2026-04-07 00:02:28.508277 | orchestrator | + stop_before_destroy = false 2026-04-07 00:02:28.508281 | orchestrator | + updated = (known after apply) 2026-04-07 00:02:28.508285 | orchestrator | + user_data = (sensitive value) 2026-04-07 00:02:28.508289 | orchestrator | 2026-04-07 00:02:28.508293 | orchestrator | + block_device { 2026-04-07 00:02:28.508297 | orchestrator | + boot_index = 0 2026-04-07 00:02:28.508300 | orchestrator | + delete_on_termination = false 2026-04-07 00:02:28.508304 | orchestrator | + destination_type = "volume" 2026-04-07 00:02:28.508308 | orchestrator | + multiattach = false 2026-04-07 00:02:28.508311 | orchestrator | + source_type = "volume" 2026-04-07 00:02:28.508315 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.508323 | orchestrator | } 2026-04-07 00:02:28.508327 | orchestrator | 2026-04-07 00:02:28.508331 | orchestrator | + network { 2026-04-07 00:02:28.508334 | orchestrator | + access_network = false 2026-04-07 00:02:28.508338 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-07 00:02:28.508342 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-07 00:02:28.508345 | orchestrator | + mac = (known after apply) 2026-04-07 00:02:28.508349 | orchestrator | + name = (known after apply) 2026-04-07 00:02:28.508353 | orchestrator | + port = (known after apply) 2026-04-07 00:02:28.508356 | orchestrator | + uuid = (known after apply) 2026-04-07 
00:02:28.508360 | orchestrator | } 2026-04-07 00:02:28.508364 | orchestrator | } 2026-04-07 00:02:28.508367 | orchestrator | 2026-04-07 00:02:28.508371 | orchestrator | # openstack_compute_instance_v2.node_server[0] will be created 2026-04-07 00:02:28.508375 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-07 00:02:28.508379 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-07 00:02:28.508382 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-07 00:02:28.508386 | orchestrator | + all_metadata = (known after apply) 2026-04-07 00:02:28.508390 | orchestrator | + all_tags = (known after apply) 2026-04-07 00:02:28.508393 | orchestrator | + availability_zone = "nova" 2026-04-07 00:02:28.508397 | orchestrator | + config_drive = true 2026-04-07 00:02:28.508401 | orchestrator | + created = (known after apply) 2026-04-07 00:02:28.508404 | orchestrator | + flavor_id = (known after apply) 2026-04-07 00:02:28.508408 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-07 00:02:28.508412 | orchestrator | + force_delete = false 2026-04-07 00:02:28.508415 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-07 00:02:28.508419 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.508423 | orchestrator | + image_id = (known after apply) 2026-04-07 00:02:28.508426 | orchestrator | + image_name = (known after apply) 2026-04-07 00:02:28.508430 | orchestrator | + key_pair = "testbed" 2026-04-07 00:02:28.508434 | orchestrator | + name = "testbed-node-0" 2026-04-07 00:02:28.508437 | orchestrator | + power_state = "active" 2026-04-07 00:02:28.508441 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.508445 | orchestrator | + security_groups = (known after apply) 2026-04-07 00:02:28.508448 | orchestrator | + stop_before_destroy = false 2026-04-07 00:02:28.508452 | orchestrator | + updated = (known after apply) 2026-04-07 00:02:28.508456 | orchestrator | + user_data = 
"ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-07 00:02:28.508459 | orchestrator | 2026-04-07 00:02:28.508463 | orchestrator | + block_device { 2026-04-07 00:02:28.508467 | orchestrator | + boot_index = 0 2026-04-07 00:02:28.508471 | orchestrator | + delete_on_termination = false 2026-04-07 00:02:28.508474 | orchestrator | + destination_type = "volume" 2026-04-07 00:02:28.508478 | orchestrator | + multiattach = false 2026-04-07 00:02:28.508482 | orchestrator | + source_type = "volume" 2026-04-07 00:02:28.508485 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.508489 | orchestrator | } 2026-04-07 00:02:28.508493 | orchestrator | 2026-04-07 00:02:28.508496 | orchestrator | + network { 2026-04-07 00:02:28.508500 | orchestrator | + access_network = false 2026-04-07 00:02:28.508504 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-07 00:02:28.508507 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-07 00:02:28.508511 | orchestrator | + mac = (known after apply) 2026-04-07 00:02:28.508515 | orchestrator | + name = (known after apply) 2026-04-07 00:02:28.508519 | orchestrator | + port = (known after apply) 2026-04-07 00:02:28.508522 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.508526 | orchestrator | } 2026-04-07 00:02:28.508530 | orchestrator | } 2026-04-07 00:02:28.508533 | orchestrator | 2026-04-07 00:02:28.508540 | orchestrator | # openstack_compute_instance_v2.node_server[1] will be created 2026-04-07 00:02:28.508544 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-07 00:02:28.508548 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-07 00:02:28.508555 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-07 00:02:28.508559 | orchestrator | + all_metadata = (known after apply) 2026-04-07 00:02:28.508562 | orchestrator | + all_tags = (known after apply) 2026-04-07 00:02:28.508566 | orchestrator | + availability_zone = "nova" 2026-04-07 00:02:28.508570 
| orchestrator | + config_drive = true 2026-04-07 00:02:28.508573 | orchestrator | + created = (known after apply) 2026-04-07 00:02:28.508577 | orchestrator | + flavor_id = (known after apply) 2026-04-07 00:02:28.508581 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-07 00:02:28.508584 | orchestrator | + force_delete = false 2026-04-07 00:02:28.508588 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-07 00:02:28.508592 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.508595 | orchestrator | + image_id = (known after apply) 2026-04-07 00:02:28.508599 | orchestrator | + image_name = (known after apply) 2026-04-07 00:02:28.508603 | orchestrator | + key_pair = "testbed" 2026-04-07 00:02:28.508606 | orchestrator | + name = "testbed-node-1" 2026-04-07 00:02:28.508610 | orchestrator | + power_state = "active" 2026-04-07 00:02:28.508614 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.508618 | orchestrator | + security_groups = (known after apply) 2026-04-07 00:02:28.508621 | orchestrator | + stop_before_destroy = false 2026-04-07 00:02:28.508625 | orchestrator | + updated = (known after apply) 2026-04-07 00:02:28.508631 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-07 00:02:28.508635 | orchestrator | 2026-04-07 00:02:28.508639 | orchestrator | + block_device { 2026-04-07 00:02:28.508642 | orchestrator | + boot_index = 0 2026-04-07 00:02:28.508646 | orchestrator | + delete_on_termination = false 2026-04-07 00:02:28.508650 | orchestrator | + destination_type = "volume" 2026-04-07 00:02:28.508653 | orchestrator | + multiattach = false 2026-04-07 00:02:28.508657 | orchestrator | + source_type = "volume" 2026-04-07 00:02:28.508661 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.508664 | orchestrator | } 2026-04-07 00:02:28.508668 | orchestrator | 2026-04-07 00:02:28.508672 | orchestrator | + network { 2026-04-07 00:02:28.508676 | orchestrator | + access_network = 
false 2026-04-07 00:02:28.508679 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-07 00:02:28.508683 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-07 00:02:28.508687 | orchestrator | + mac = (known after apply) 2026-04-07 00:02:28.508690 | orchestrator | + name = (known after apply) 2026-04-07 00:02:28.508694 | orchestrator | + port = (known after apply) 2026-04-07 00:02:28.508698 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.508701 | orchestrator | } 2026-04-07 00:02:28.508705 | orchestrator | } 2026-04-07 00:02:28.508709 | orchestrator | 2026-04-07 00:02:28.508712 | orchestrator | # openstack_compute_instance_v2.node_server[2] will be created 2026-04-07 00:02:28.508716 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-07 00:02:28.508720 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-07 00:02:28.508723 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-07 00:02:28.508728 | orchestrator | + all_metadata = (known after apply) 2026-04-07 00:02:28.508731 | orchestrator | + all_tags = (known after apply) 2026-04-07 00:02:28.508735 | orchestrator | + availability_zone = "nova" 2026-04-07 00:02:28.508739 | orchestrator | + config_drive = true 2026-04-07 00:02:28.508742 | orchestrator | + created = (known after apply) 2026-04-07 00:02:28.508746 | orchestrator | + flavor_id = (known after apply) 2026-04-07 00:02:28.508750 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-07 00:02:28.508753 | orchestrator | + force_delete = false 2026-04-07 00:02:28.508757 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-07 00:02:28.508761 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.508764 | orchestrator | + image_id = (known after apply) 2026-04-07 00:02:28.508771 | orchestrator | + image_name = (known after apply) 2026-04-07 00:02:28.508775 | orchestrator | + key_pair = "testbed" 2026-04-07 00:02:28.508778 | orchestrator | + name = 
"testbed-node-2" 2026-04-07 00:02:28.508782 | orchestrator | + power_state = "active" 2026-04-07 00:02:28.508786 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.508789 | orchestrator | + security_groups = (known after apply) 2026-04-07 00:02:28.508793 | orchestrator | + stop_before_destroy = false 2026-04-07 00:02:28.508797 | orchestrator | + updated = (known after apply) 2026-04-07 00:02:28.508800 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-07 00:02:28.508804 | orchestrator | 2026-04-07 00:02:28.508808 | orchestrator | + block_device { 2026-04-07 00:02:28.508811 | orchestrator | + boot_index = 0 2026-04-07 00:02:28.508815 | orchestrator | + delete_on_termination = false 2026-04-07 00:02:28.508819 | orchestrator | + destination_type = "volume" 2026-04-07 00:02:28.508822 | orchestrator | + multiattach = false 2026-04-07 00:02:28.508826 | orchestrator | + source_type = "volume" 2026-04-07 00:02:28.508829 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.508833 | orchestrator | } 2026-04-07 00:02:28.508837 | orchestrator | 2026-04-07 00:02:28.508840 | orchestrator | + network { 2026-04-07 00:02:28.508844 | orchestrator | + access_network = false 2026-04-07 00:02:28.508848 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-07 00:02:28.508851 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-07 00:02:28.508855 | orchestrator | + mac = (known after apply) 2026-04-07 00:02:28.508859 | orchestrator | + name = (known after apply) 2026-04-07 00:02:28.508862 | orchestrator | + port = (known after apply) 2026-04-07 00:02:28.508866 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.508870 | orchestrator | } 2026-04-07 00:02:28.508873 | orchestrator | } 2026-04-07 00:02:28.508877 | orchestrator | 2026-04-07 00:02:28.508883 | orchestrator | # openstack_compute_instance_v2.node_server[3] will be created 2026-04-07 00:02:28.508887 | orchestrator | + resource 
"openstack_compute_instance_v2" "node_server" { 2026-04-07 00:02:28.508891 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-07 00:02:28.508894 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-07 00:02:28.508898 | orchestrator | + all_metadata = (known after apply) 2026-04-07 00:02:28.508902 | orchestrator | + all_tags = (known after apply) 2026-04-07 00:02:28.508905 | orchestrator | + availability_zone = "nova" 2026-04-07 00:02:28.508909 | orchestrator | + config_drive = true 2026-04-07 00:02:28.508913 | orchestrator | + created = (known after apply) 2026-04-07 00:02:28.508918 | orchestrator | + flavor_id = (known after apply) 2026-04-07 00:02:28.508922 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-07 00:02:28.508926 | orchestrator | + force_delete = false 2026-04-07 00:02:28.508930 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-07 00:02:28.508933 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.508937 | orchestrator | + image_id = (known after apply) 2026-04-07 00:02:28.508941 | orchestrator | + image_name = (known after apply) 2026-04-07 00:02:28.508944 | orchestrator | + key_pair = "testbed" 2026-04-07 00:02:28.508948 | orchestrator | + name = "testbed-node-3" 2026-04-07 00:02:28.508952 | orchestrator | + power_state = "active" 2026-04-07 00:02:28.508955 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.508959 | orchestrator | + security_groups = (known after apply) 2026-04-07 00:02:28.508963 | orchestrator | + stop_before_destroy = false 2026-04-07 00:02:28.508966 | orchestrator | + updated = (known after apply) 2026-04-07 00:02:28.508970 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-07 00:02:28.508974 | orchestrator | 2026-04-07 00:02:28.508977 | orchestrator | + block_device { 2026-04-07 00:02:28.508981 | orchestrator | + boot_index = 0 2026-04-07 00:02:28.508985 | orchestrator | + delete_on_termination = false 2026-04-07 
00:02:28.509001 | orchestrator | + destination_type = "volume" 2026-04-07 00:02:28.509008 | orchestrator | + multiattach = false 2026-04-07 00:02:28.509011 | orchestrator | + source_type = "volume" 2026-04-07 00:02:28.509015 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.509019 | orchestrator | } 2026-04-07 00:02:28.509023 | orchestrator | 2026-04-07 00:02:28.509026 | orchestrator | + network { 2026-04-07 00:02:28.509030 | orchestrator | + access_network = false 2026-04-07 00:02:28.509034 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-07 00:02:28.509037 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-07 00:02:28.509041 | orchestrator | + mac = (known after apply) 2026-04-07 00:02:28.509045 | orchestrator | + name = (known after apply) 2026-04-07 00:02:28.509048 | orchestrator | + port = (known after apply) 2026-04-07 00:02:28.509052 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.509056 | orchestrator | } 2026-04-07 00:02:28.509059 | orchestrator | } 2026-04-07 00:02:28.509063 | orchestrator | 2026-04-07 00:02:28.509067 | orchestrator | # openstack_compute_instance_v2.node_server[4] will be created 2026-04-07 00:02:28.509071 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-07 00:02:28.509075 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-07 00:02:28.509078 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-07 00:02:28.509082 | orchestrator | + all_metadata = (known after apply) 2026-04-07 00:02:28.509086 | orchestrator | + all_tags = (known after apply) 2026-04-07 00:02:28.509089 | orchestrator | + availability_zone = "nova" 2026-04-07 00:02:28.509093 | orchestrator | + config_drive = true 2026-04-07 00:02:28.509097 | orchestrator | + created = (known after apply) 2026-04-07 00:02:28.509100 | orchestrator | + flavor_id = (known after apply) 2026-04-07 00:02:28.509104 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-07 00:02:28.509108 | 
orchestrator | + force_delete = false 2026-04-07 00:02:28.509112 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-07 00:02:28.509115 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.509119 | orchestrator | + image_id = (known after apply) 2026-04-07 00:02:28.509123 | orchestrator | + image_name = (known after apply) 2026-04-07 00:02:28.509126 | orchestrator | + key_pair = "testbed" 2026-04-07 00:02:28.509130 | orchestrator | + name = "testbed-node-4" 2026-04-07 00:02:28.509134 | orchestrator | + power_state = "active" 2026-04-07 00:02:28.509137 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.509141 | orchestrator | + security_groups = (known after apply) 2026-04-07 00:02:28.509145 | orchestrator | + stop_before_destroy = false 2026-04-07 00:02:28.509148 | orchestrator | + updated = (known after apply) 2026-04-07 00:02:28.509152 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-07 00:02:28.509156 | orchestrator | 2026-04-07 00:02:28.509160 | orchestrator | + block_device { 2026-04-07 00:02:28.509163 | orchestrator | + boot_index = 0 2026-04-07 00:02:28.509167 | orchestrator | + delete_on_termination = false 2026-04-07 00:02:28.509171 | orchestrator | + destination_type = "volume" 2026-04-07 00:02:28.509175 | orchestrator | + multiattach = false 2026-04-07 00:02:28.509178 | orchestrator | + source_type = "volume" 2026-04-07 00:02:28.509182 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.509186 | orchestrator | } 2026-04-07 00:02:28.509189 | orchestrator | 2026-04-07 00:02:28.509193 | orchestrator | + network { 2026-04-07 00:02:28.509197 | orchestrator | + access_network = false 2026-04-07 00:02:28.509200 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-07 00:02:28.509204 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-07 00:02:28.509208 | orchestrator | + mac = (known after apply) 2026-04-07 00:02:28.509211 | orchestrator | + name = (known 
after apply) 2026-04-07 00:02:28.509215 | orchestrator | + port = (known after apply) 2026-04-07 00:02:28.509219 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.509223 | orchestrator | } 2026-04-07 00:02:28.509226 | orchestrator | } 2026-04-07 00:02:28.509233 | orchestrator | 2026-04-07 00:02:28.509237 | orchestrator | # openstack_compute_instance_v2.node_server[5] will be created 2026-04-07 00:02:28.509241 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-07 00:02:28.509244 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-07 00:02:28.509248 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-07 00:02:28.509252 | orchestrator | + all_metadata = (known after apply) 2026-04-07 00:02:28.509255 | orchestrator | + all_tags = (known after apply) 2026-04-07 00:02:28.509259 | orchestrator | + availability_zone = "nova" 2026-04-07 00:02:28.509262 | orchestrator | + config_drive = true 2026-04-07 00:02:28.509266 | orchestrator | + created = (known after apply) 2026-04-07 00:02:28.509270 | orchestrator | + flavor_id = (known after apply) 2026-04-07 00:02:28.509274 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-07 00:02:28.509277 | orchestrator | + force_delete = false 2026-04-07 00:02:28.509281 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-07 00:02:28.509285 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.509288 | orchestrator | + image_id = (known after apply) 2026-04-07 00:02:28.509292 | orchestrator | + image_name = (known after apply) 2026-04-07 00:02:28.509295 | orchestrator | + key_pair = "testbed" 2026-04-07 00:02:28.509299 | orchestrator | + name = "testbed-node-5" 2026-04-07 00:02:28.509303 | orchestrator | + power_state = "active" 2026-04-07 00:02:28.509309 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.509313 | orchestrator | + security_groups = (known after apply) 2026-04-07 00:02:28.509317 | orchestrator | + 
stop_before_destroy = false 2026-04-07 00:02:28.509320 | orchestrator | + updated = (known after apply) 2026-04-07 00:02:28.509324 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-07 00:02:28.509328 | orchestrator | 2026-04-07 00:02:28.509332 | orchestrator | + block_device { 2026-04-07 00:02:28.509335 | orchestrator | + boot_index = 0 2026-04-07 00:02:28.509339 | orchestrator | + delete_on_termination = false 2026-04-07 00:02:28.509343 | orchestrator | + destination_type = "volume" 2026-04-07 00:02:28.509346 | orchestrator | + multiattach = false 2026-04-07 00:02:28.509350 | orchestrator | + source_type = "volume" 2026-04-07 00:02:28.509354 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.509357 | orchestrator | } 2026-04-07 00:02:28.509361 | orchestrator | 2026-04-07 00:02:28.509365 | orchestrator | + network { 2026-04-07 00:02:28.509368 | orchestrator | + access_network = false 2026-04-07 00:02:28.509372 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-07 00:02:28.509376 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-07 00:02:28.509379 | orchestrator | + mac = (known after apply) 2026-04-07 00:02:28.509383 | orchestrator | + name = (known after apply) 2026-04-07 00:02:28.509387 | orchestrator | + port = (known after apply) 2026-04-07 00:02:28.509391 | orchestrator | + uuid = (known after apply) 2026-04-07 00:02:28.509394 | orchestrator | } 2026-04-07 00:02:28.509398 | orchestrator | } 2026-04-07 00:02:28.509402 | orchestrator | 2026-04-07 00:02:28.509405 | orchestrator | # openstack_compute_keypair_v2.key will be created 2026-04-07 00:02:28.509409 | orchestrator | + resource "openstack_compute_keypair_v2" "key" { 2026-04-07 00:02:28.509413 | orchestrator | + fingerprint = (known after apply) 2026-04-07 00:02:28.509417 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.509420 | orchestrator | + name = "testbed" 2026-04-07 00:02:28.509424 | orchestrator | + private_key = 
(sensitive value) 2026-04-07 00:02:28.509428 | orchestrator | + public_key = (known after apply) 2026-04-07 00:02:28.509431 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.509435 | orchestrator | + user_id = (known after apply) 2026-04-07 00:02:28.509439 | orchestrator | } 2026-04-07 00:02:28.509442 | orchestrator | 2026-04-07 00:02:28.509446 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[0] will be created 2026-04-07 00:02:28.509450 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-04-07 00:02:28.509457 | orchestrator | + device = (known after apply) 2026-04-07 00:02:28.509460 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.509464 | orchestrator | + instance_id = (known after apply) 2026-04-07 00:02:28.509468 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.509474 | orchestrator | + volume_id = (known after apply) 2026-04-07 00:02:28.509478 | orchestrator | } 2026-04-07 00:02:28.509481 | orchestrator | 2026-04-07 00:02:28.509485 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[1] will be created 2026-04-07 00:02:28.509489 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-04-07 00:02:28.509493 | orchestrator | + device = (known after apply) 2026-04-07 00:02:28.509496 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.509500 | orchestrator | + instance_id = (known after apply) 2026-04-07 00:02:28.509504 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.509507 | orchestrator | + volume_id = (known after apply) 2026-04-07 00:02:28.509511 | orchestrator | } 2026-04-07 00:02:28.509515 | orchestrator | 2026-04-07 00:02:28.509519 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[2] will be created 2026-04-07 00:02:28.509522 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" 
{
    + device      = (known after apply)
    + id          = (known after apply)
    + instance_id = (known after apply)
    + region      = (known after apply)
    + volume_id   = (known after apply)
  }

# openstack_compute_volume_attach_v2.node_volume_attachment[3] will be created
+ resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
    + device      = (known after apply)
    + id          = (known after apply)
    + instance_id = (known after apply)
    + region      = (known after apply)
    + volume_id   = (known after apply)
  }

# openstack_compute_volume_attach_v2.node_volume_attachment[4] will be created
+ resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
    + device      = (known after apply)
    + id          = (known after apply)
    + instance_id = (known after apply)
    + region      = (known after apply)
    + volume_id   = (known after apply)
  }

# openstack_compute_volume_attach_v2.node_volume_attachment[5] will be created
+ resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
    + device      = (known after apply)
    + id          = (known after apply)
    + instance_id = (known after apply)
    + region      = (known after apply)
    + volume_id   = (known after apply)
  }

# openstack_compute_volume_attach_v2.node_volume_attachment[6] will be created
+ resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
    + device      = (known after apply)
    + id          = (known after apply)
    + instance_id = (known after apply)
    + region      = (known after apply)
    + volume_id   = (known after apply)
  }

# openstack_compute_volume_attach_v2.node_volume_attachment[7] will be created
+ resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
    + device      = (known after apply)
    + id          = (known after apply)
    + instance_id = (known after apply)
    + region      = (known after apply)
    + volume_id   = (known after apply)
  }

# openstack_compute_volume_attach_v2.node_volume_attachment[8] will be created
+ resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
    + device      = (known after apply)
    + id          = (known after apply)
    + instance_id = (known after apply)
    + region      = (known after apply)
    + volume_id   = (known after apply)
  }

# openstack_networking_floatingip_associate_v2.manager_floating_ip_association will be created
+ resource "openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" {
    + fixed_ip    = (known after apply)
    + floating_ip = (known after apply)
    + id          = (known after apply)
    + port_id     = (known after apply)
    + region      = (known after apply)
  }

# openstack_networking_floatingip_v2.manager_floating_ip will be created
+ resource "openstack_networking_floatingip_v2" "manager_floating_ip" {
    + address    = (known after apply)
    + all_tags   = (known after apply)
    + dns_domain = (known after apply)
    + dns_name   = (known after apply)
    + fixed_ip   = (known after apply)
    + id         = (known after apply)
    + pool       = "public"
    + port_id    = (known after apply)
    + region     = (known after apply)
    + subnet_id  = (known after apply)
    + tenant_id  = (known after apply)
  }

# openstack_networking_network_v2.net_management will be created
+ resource "openstack_networking_network_v2" "net_management" {
    + admin_state_up          = (known after apply)
    + all_tags                = (known after apply)
    + availability_zone_hints = [
        + "nova",
      ]
    + dns_domain              = (known after apply)
    + external                = (known after apply)
    + id                      = (known after apply)
    + mtu                     = (known after apply)
    + name                    = "net-testbed-management"
    + port_security_enabled   = (known after apply)
    + qos_policy_id           = (known after apply)
    + region                  = (known after apply)
    + shared                  = (known after apply)
    + tenant_id               = (known after apply)
    + transparent_vlan        = (known after apply)

    + segments (known after apply)
  }

# openstack_networking_port_v2.manager_port_management will be created
+ resource "openstack_networking_port_v2" "manager_port_management" {
    + admin_state_up         = (known after apply)
    + all_fixed_ips          = (known after apply)
    + all_security_group_ids = (known after apply)
    + all_tags               = (known after apply)
    + device_id              = (known after apply)
    + device_owner           = (known after apply)
    + dns_assignment         = (known after apply)
    + dns_name               = (known after apply)
    + id                     = (known after apply)
    + mac_address            = (known after apply)
    + network_id             = (known after apply)
    + port_security_enabled  = (known after apply)
    + qos_policy_id          = (known after apply)
    + region                 = (known after apply)
    + security_group_ids     = (known after apply)
    + tenant_id              = (known after apply)

    + allowed_address_pairs {
        + ip_address = "192.168.16.8/32"
      }

    + binding (known after apply)

    + fixed_ip {
        + ip_address = "192.168.16.5"
        + subnet_id  = (known after apply)
      }
  }

# openstack_networking_port_v2.node_port_management[0] will be created
+ resource "openstack_networking_port_v2" "node_port_management" {
    + admin_state_up         = (known after apply)
    + all_fixed_ips          = (known after apply)
    + all_security_group_ids = (known after apply)
    + all_tags               = (known after apply)
    + device_id              = (known after apply)
    + device_owner           = (known after apply)
    + dns_assignment         = (known after apply)
    + dns_name               = (known after apply)
    + id                     = (known after apply)
    + mac_address            = (known after apply)
    + network_id             = (known after apply)
    + port_security_enabled  = (known after apply)
    + qos_policy_id          = (known after apply)
    + region                 = (known after apply)
    + security_group_ids     = (known after apply)
    + tenant_id              = (known after apply)

    + allowed_address_pairs {
        + ip_address = "192.168.16.254/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.8/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.9/32"
      }

    + binding (known after apply)

    + fixed_ip {
        + ip_address = "192.168.16.10"
        + subnet_id  = (known after apply)
      }
  }

# openstack_networking_port_v2.node_port_management[1] will be created
+ resource "openstack_networking_port_v2" "node_port_management" {
    + admin_state_up         = (known after apply)
    + all_fixed_ips          = (known after apply)
    + all_security_group_ids = (known after apply)
    + all_tags               = (known after apply)
    + device_id              = (known after apply)
    + device_owner           = (known after apply)
    + dns_assignment         = (known after apply)
    + dns_name               = (known after apply)
    + id                     = (known after apply)
    + mac_address            = (known after apply)
    + network_id             = (known after apply)
    + port_security_enabled  = (known after apply)
    + qos_policy_id          = (known after apply)
    + region                 = (known after apply)
    + security_group_ids     = (known after apply)
    + tenant_id              = (known after apply)

    + allowed_address_pairs {
        + ip_address = "192.168.16.254/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.8/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.9/32"
      }

    + binding (known after apply)

    + fixed_ip {
        + ip_address = "192.168.16.11"
        + subnet_id  = (known after apply)
      }
  }

# openstack_networking_port_v2.node_port_management[2] will be created
+ resource "openstack_networking_port_v2" "node_port_management" {
    + admin_state_up         = (known after apply)
    + all_fixed_ips          = (known after apply)
    + all_security_group_ids = (known after apply)
    + all_tags               = (known after apply)
    + device_id              = (known after apply)
    + device_owner           = (known after apply)
    + dns_assignment         = (known after apply)
    + dns_name               = (known after apply)
    + id                     = (known after apply)
    + mac_address            = (known after apply)
    + network_id             = (known after apply)
    + port_security_enabled  = (known after apply)
    + qos_policy_id          = (known after apply)
    + region                 = (known after apply)
    + security_group_ids     = (known after apply)
    + tenant_id              = (known after apply)

    + allowed_address_pairs {
        + ip_address = "192.168.16.254/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.8/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.9/32"
      }

    + binding (known after apply)

    + fixed_ip {
        + ip_address = "192.168.16.12"
        + subnet_id  = (known after apply)
      }
  }

# openstack_networking_port_v2.node_port_management[3] will be created
+ resource "openstack_networking_port_v2" "node_port_management" {
    + admin_state_up         = (known after apply)
    + all_fixed_ips          = (known after apply)
    + all_security_group_ids = (known after apply)
    + all_tags               = (known after apply)
    + device_id              = (known after apply)
    + device_owner           = (known after apply)
    + dns_assignment         = (known after apply)
    + dns_name               = (known after apply)
    + id                     = (known after apply)
    + mac_address            = (known after apply)
    + network_id             = (known after apply)
    + port_security_enabled  = (known after apply)
    + qos_policy_id          = (known after apply)
    + region                 = (known after apply)
    + security_group_ids     = (known after apply)
    + tenant_id              = (known after apply)

    + allowed_address_pairs {
        + ip_address = "192.168.16.254/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.8/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.9/32"
      }

    + binding (known after apply)

    + fixed_ip {
        + ip_address = "192.168.16.13"
        + subnet_id  = (known after apply)
      }
  }

# openstack_networking_port_v2.node_port_management[4] will be created
+ resource "openstack_networking_port_v2" "node_port_management" {
    + admin_state_up         = (known after apply)
    + all_fixed_ips          = (known after apply)
    + all_security_group_ids = (known after apply)
    + all_tags               = (known after apply)
    + device_id              = (known after apply)
    + device_owner           = (known after apply)
    + dns_assignment         = (known after apply)
    + dns_name               = (known after apply)
    + id                     = (known after apply)
    + mac_address            = (known after apply)
    + network_id             = (known after apply)
    + port_security_enabled  = (known after apply)
    + qos_policy_id          = (known after apply)
    + region                 = (known after apply)
    + security_group_ids     = (known after apply)
    + tenant_id              = (known after apply)

    + allowed_address_pairs {
        + ip_address = "192.168.16.254/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.8/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.9/32"
      }

    + binding (known after apply)

    + fixed_ip {
        + ip_address = "192.168.16.14"
        + subnet_id  = (known after apply)
      }
  }

# openstack_networking_port_v2.node_port_management[5] will be created
+ resource "openstack_networking_port_v2" "node_port_management" {
    + admin_state_up         = (known after apply)
    + all_fixed_ips          = (known after apply)
    + all_security_group_ids = (known after apply)
    + all_tags               = (known after apply)
    + device_id              = (known after apply)
    + device_owner           = (known after apply)
    + dns_assignment         = (known after apply)
    + dns_name               = (known after apply)
    + id                     = (known after apply)
    + mac_address            = (known after apply)
    + network_id             = (known after apply)
    + port_security_enabled  = (known after apply)
    + qos_policy_id          = (known after apply)
    + region                 = (known after apply)
    + security_group_ids     = (known after apply)
    + tenant_id              = (known after apply)

    + allowed_address_pairs {
        + ip_address = "192.168.16.254/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.8/32"
      }
    + allowed_address_pairs {
        + ip_address = "192.168.16.9/32"
      }

    + binding (known after apply)

    + fixed_ip {
        + ip_address = "192.168.16.15"
        + subnet_id  = (known after apply)
      }
  }

# openstack_networking_router_interface_v2.router_interface will be created
+ resource "openstack_networking_router_interface_v2" "router_interface" {
    + force_destroy = false
    + id            = (known after apply)
    + port_id       = (known after apply)
    + region        = (known after apply)
    + router_id     = (known after apply)
    + subnet_id     = (known after apply)
  }

# openstack_networking_router_v2.router will be created
+ resource "openstack_networking_router_v2" "router" {
    + admin_state_up          = (known after apply)
    + all_tags                = (known after apply)
    + availability_zone_hints = [
        + "nova",
      ]
    + distributed             = (known after apply)
    + enable_snat             = (known after apply)
    + external_network_id     = "e6be7364-bfd8-4de7-8120-8f41c69a139a"
    + external_qos_policy_id  = (known after apply)
    + id                      = (known after apply)
    + name                    = "testbed"
    + region                  = (known after apply)
    + tenant_id               = (known after apply)

    + external_fixed_ip (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_management_rule1 will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" {
    + description             = "ssh"
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + port_range_max          = 22
    + port_range_min          = 22
    + protocol                = "tcp"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "0.0.0.0/0"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_management_rule2 will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule2" {
    + description             = "wireguard"
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + port_range_max          = 51820
    + port_range_min          = 51820
    + protocol                = "udp"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "0.0.0.0/0"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_management_rule3 will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule3" {
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + protocol                = "tcp"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "192.168.16.0/20"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_management_rule4 will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule4" {
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + protocol                = "udp"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "192.168.16.0/20"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_management_rule5 will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule5" {
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + protocol                = "icmp"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "0.0.0.0/0"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_node_rule1 will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule1" {
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + protocol                = "tcp"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "0.0.0.0/0"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_node_rule2 will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule2" {
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + protocol                = "udp"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "0.0.0.0/0"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_node_rule3 will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule3" {
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + protocol                = "icmp"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "0.0.0.0/0"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_rule_v2.security_group_rule_vrrp will be created
+ resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" {
    + description             = "vrrp"
    + direction               = "ingress"
    + ethertype               = "IPv4"
    + id                      = (known after apply)
    + protocol                = "112"
    + region                  = (known after apply)
    + remote_address_group_id = (known after apply)
    + remote_group_id         = (known after apply)
    + remote_ip_prefix        = "0.0.0.0/0"
    + security_group_id       = (known after apply)
    + tenant_id               = (known after apply)
  }

# openstack_networking_secgroup_v2.security_group_management will be created
+ resource "openstack_networking_secgroup_v2" "security_group_management" {
    + all_tags    = (known after apply)
    + description = "management security group"
    + id          = (known after apply)
    + name        = "testbed-management"
    + region      = (known after apply)
    + stateful    = (known after apply)
    + tenant_id   = (known after apply)
  }

# openstack_networking_secgroup_v2.security_group_node will be created
+ resource "openstack_networking_secgroup_v2" "security_group_node" {
    + all_tags    = (known after apply)
    + description = "node security group"
    + id          = (known after apply)
    + name        = "testbed-node"
    + region      = (known after apply)
    + stateful    = (known after apply)
    + tenant_id   = (known after apply)
  }

# openstack_networking_subnet_v2.subnet_management will be created
+ resource "openstack_networking_subnet_v2" "subnet_management" {
    + all_tags          = (known after apply)
    + cidr              = "192.168.16.0/20"
    + dns_nameservers   = [
        + "8.8.8.8",
        + "9.9.9.9",
      ]
    + enable_dhcp       = true
    + gateway_ip        = (known after apply)
    + id                = (known after apply)
    + ip_version        = 4
    + ipv6_address_mode = (known after apply)
    + ipv6_ra_mode      = (known after apply)
    + name              = "subnet-testbed-management"
2026-04-07 00:02:28.511728 | orchestrator | + network_id = (known after apply) 2026-04-07 00:02:28.511732 | orchestrator | + no_gateway = false 2026-04-07 00:02:28.511735 | orchestrator | + region = (known after apply) 2026-04-07 00:02:28.511739 | orchestrator | + service_types = (known after apply) 2026-04-07 00:02:28.511745 | orchestrator | + tenant_id = (known after apply) 2026-04-07 00:02:28.511749 | orchestrator | 2026-04-07 00:02:28.511753 | orchestrator | + allocation_pool { 2026-04-07 00:02:28.511757 | orchestrator | + end = "192.168.31.250" 2026-04-07 00:02:28.511760 | orchestrator | + start = "192.168.31.200" 2026-04-07 00:02:28.511764 | orchestrator | } 2026-04-07 00:02:28.511768 | orchestrator | } 2026-04-07 00:02:28.511772 | orchestrator | 2026-04-07 00:02:28.511775 | orchestrator | # terraform_data.image will be created 2026-04-07 00:02:28.511779 | orchestrator | + resource "terraform_data" "image" { 2026-04-07 00:02:28.511783 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.511787 | orchestrator | + input = "Ubuntu 24.04" 2026-04-07 00:02:28.511790 | orchestrator | + output = (known after apply) 2026-04-07 00:02:28.511794 | orchestrator | } 2026-04-07 00:02:28.511798 | orchestrator | 2026-04-07 00:02:28.511801 | orchestrator | # terraform_data.image_node will be created 2026-04-07 00:02:28.511805 | orchestrator | + resource "terraform_data" "image_node" { 2026-04-07 00:02:28.511809 | orchestrator | + id = (known after apply) 2026-04-07 00:02:28.511813 | orchestrator | + input = "Ubuntu 24.04" 2026-04-07 00:02:28.511816 | orchestrator | + output = (known after apply) 2026-04-07 00:02:28.511820 | orchestrator | } 2026-04-07 00:02:28.511824 | orchestrator | 2026-04-07 00:02:28.511828 | orchestrator | Plan: 64 to add, 0 to change, 0 to destroy. 
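2026-04-07 (editor's note) | The plan listing above renders each resource attribute-by-attribute. For reference, the Terraform source producing entries like `security_group_rule_vrrp` and `subnet_management` would look roughly as follows. This is a sketch reconstructed from the plan attributes shown in the log, not the testbed repository's actual source; in particular, which security group the VRRP rule attaches to is printed as `(known after apply)`, so the `security_group_node` reference below is an assumption.

```hcl
# Sketch of the VRRP rule from the plan above. Protocol "112" is the
# IP protocol number for VRRP; remote_ip_prefix 0.0.0.0/0 admits any
# IPv4 source. The security_group_id target is assumed, not shown in the plan.
resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" {
  description       = "vrrp"
  direction         = "ingress"
  ethertype         = "IPv4"
  protocol          = "112"
  remote_ip_prefix  = "0.0.0.0/0"
  security_group_id = openstack_networking_secgroup_v2.security_group_node.id
}

# Sketch of the management subnet with its DHCP allocation pool,
# matching the cidr, nameservers, and pool range printed in the plan.
resource "openstack_networking_subnet_v2" "subnet_management" {
  name            = "subnet-testbed-management"
  network_id      = openstack_networking_network_v2.net_management.id
  cidr            = "192.168.16.0/20"
  ip_version      = 4
  enable_dhcp     = true
  dns_nameservers = ["8.8.8.8", "9.9.9.9"]

  allocation_pool {
    start = "192.168.31.200"
    end   = "192.168.31.250"
  }
}
```

Attributes the plan marks `(known after apply)` (ids, region, tenant_id) are computed by the provider and do not appear in source.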
2026-04-07 00:02:28.511831 | orchestrator | 2026-04-07 00:02:28.511835 | orchestrator | Changes to Outputs: 2026-04-07 00:02:28.511839 | orchestrator | + manager_address = (sensitive value) 2026-04-07 00:02:28.511843 | orchestrator | + private_key = (sensitive value) 2026-04-07 00:02:28.658120 | orchestrator | terraform_data.image_node: Creating... 2026-04-07 00:02:28.658178 | orchestrator | terraform_data.image_node: Creation complete after 0s [id=712a2c4a-6091-0f36-56dc-818cc8aef3a2] 2026-04-07 00:02:28.942130 | orchestrator | terraform_data.image: Creating... 2026-04-07 00:02:28.942187 | orchestrator | terraform_data.image: Creation complete after 0s [id=1220bae7-13e8-4e90-4915-43709b8a3c82] 2026-04-07 00:02:28.946163 | orchestrator | data.openstack_images_image_v2.image_node: Reading... 2026-04-07 00:02:28.946206 | orchestrator | data.openstack_images_image_v2.image: Reading... 2026-04-07 00:02:28.956890 | orchestrator | openstack_blockstorage_volume_v3.node_volume[0]: Creating... 2026-04-07 00:02:28.958237 | orchestrator | openstack_blockstorage_volume_v3.node_volume[4]: Creating... 2026-04-07 00:02:28.958647 | orchestrator | openstack_blockstorage_volume_v3.node_volume[2]: Creating... 2026-04-07 00:02:28.962060 | orchestrator | openstack_compute_keypair_v2.key: Creating... 2026-04-07 00:02:28.962093 | orchestrator | openstack_networking_network_v2.net_management: Creating... 2026-04-07 00:02:28.964702 | orchestrator | openstack_blockstorage_volume_v3.node_volume[6]: Creating... 2026-04-07 00:02:28.969758 | orchestrator | openstack_blockstorage_volume_v3.node_volume[5]: Creating... 2026-04-07 00:02:28.984812 | orchestrator | openstack_blockstorage_volume_v3.node_volume[3]: Creating... 2026-04-07 00:02:29.422758 | orchestrator | data.openstack_images_image_v2.image_node: Read complete after 0s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2026-04-07 00:02:29.432111 | orchestrator | openstack_blockstorage_volume_v3.node_volume[1]: Creating... 
2026-04-07 00:02:29.450701 | orchestrator | data.openstack_images_image_v2.image: Read complete after 0s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2026-04-07 00:02:29.454868 | orchestrator | openstack_blockstorage_volume_v3.node_volume[8]: Creating... 2026-04-07 00:02:29.485854 | orchestrator | openstack_compute_keypair_v2.key: Creation complete after 0s [id=testbed] 2026-04-07 00:02:29.494046 | orchestrator | openstack_blockstorage_volume_v3.node_volume[7]: Creating... 2026-04-07 00:02:30.262759 | orchestrator | openstack_networking_network_v2.net_management: Creation complete after 1s [id=be831a0d-6c85-4b67-a3ff-4f8fca772654] 2026-04-07 00:02:30.269058 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[5]: Creating... 2026-04-07 00:02:32.910281 | orchestrator | openstack_blockstorage_volume_v3.node_volume[0]: Creation complete after 4s [id=0aceb24c-1141-4b89-81c4-2bd069400a76] 2026-04-07 00:02:32.914939 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[0]: Creating... 2026-04-07 00:02:32.937534 | orchestrator | openstack_blockstorage_volume_v3.node_volume[6]: Creation complete after 4s [id=d98a6229-64c7-4f26-837e-eda0f824cf1d] 2026-04-07 00:02:32.943736 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[2]: Creating... 2026-04-07 00:02:32.967981 | orchestrator | openstack_blockstorage_volume_v3.node_volume[5]: Creation complete after 4s [id=61826d0c-ccdc-4393-b392-5dc26cd19349] 2026-04-07 00:02:32.995368 | orchestrator | openstack_blockstorage_volume_v3.node_volume[4]: Creation complete after 4s [id=967b79e7-41ef-439c-974d-46e00c7544ba] 2026-04-07 00:02:32.995425 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[4]: Creating... 2026-04-07 00:02:33.012209 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[3]: Creating... 
2026-04-07 00:02:33.026655 | orchestrator | openstack_blockstorage_volume_v3.node_volume[2]: Creation complete after 4s [id=d9b6b982-5d2c-47ad-95ce-6e4d358a27cd] 2026-04-07 00:02:33.026778 | orchestrator | openstack_blockstorage_volume_v3.node_volume[1]: Creation complete after 4s [id=1469229d-4b75-4251-a9b8-5b75cda4a696] 2026-04-07 00:02:33.028515 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[1]: Creating... 2026-04-07 00:02:33.032967 | orchestrator | openstack_blockstorage_volume_v3.manager_base_volume[0]: Creating... 2026-04-07 00:02:33.048186 | orchestrator | openstack_blockstorage_volume_v3.node_volume[3]: Creation complete after 4s [id=ee2515b7-1de0-4cb8-a492-67bb0415ec88] 2026-04-07 00:02:33.057741 | orchestrator | local_file.id_rsa_pub: Creating... 2026-04-07 00:02:33.063247 | orchestrator | local_file.id_rsa_pub: Creation complete after 0s [id=5e70ca80720fe325253e02f74ec1bd6ed7d04416] 2026-04-07 00:02:33.077015 | orchestrator | local_sensitive_file.id_rsa: Creating... 2026-04-07 00:02:33.084156 | orchestrator | local_sensitive_file.id_rsa: Creation complete after 0s [id=fb55e29cb341c2d7c2045f999b7553222ce921ec] 2026-04-07 00:02:33.100806 | orchestrator | openstack_networking_subnet_v2.subnet_management: Creating... 
2026-04-07 00:02:33.164288 | orchestrator | openstack_blockstorage_volume_v3.node_volume[8]: Creation complete after 4s [id=e06458de-fcc8-49b9-b479-fcb02169b5c8] 2026-04-07 00:02:33.228371 | orchestrator | openstack_blockstorage_volume_v3.node_volume[7]: Creation complete after 4s [id=18dce6fc-4f14-415a-9461-5b764394eff6] 2026-04-07 00:02:33.697097 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[5]: Creation complete after 4s [id=4b41c8c8-cb46-4d77-a16f-c33d82450db9] 2026-04-07 00:02:34.140323 | orchestrator | openstack_networking_subnet_v2.subnet_management: Creation complete after 1s [id=472c0fd9-81fc-4748-b09d-21cabced07fb] 2026-04-07 00:02:34.146794 | orchestrator | openstack_networking_router_v2.router: Creating... 2026-04-07 00:02:36.471102 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[0]: Creation complete after 3s [id=d03c7bab-3b09-492f-89a4-e7206370e450] 2026-04-07 00:02:36.592482 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[3]: Creation complete after 4s [id=5f996502-1da9-49e5-9e1f-f2a253186967] 2026-04-07 00:02:36.627510 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[4]: Creation complete after 4s [id=3bcc4df9-bd9b-42b2-a91b-3b385d323e82] 2026-04-07 00:02:36.724733 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[2]: Creation complete after 4s [id=0c3908cd-9ae7-4f32-bcd9-16913c42debe] 2026-04-07 00:02:36.790446 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[1]: Creation complete after 4s [id=b9ec3702-6661-45b7-a0cf-93bf4acfc295] 2026-04-07 00:02:36.820839 | orchestrator | openstack_blockstorage_volume_v3.manager_base_volume[0]: Creation complete after 4s [id=54f3b5f8-7f6f-41ab-9479-d68fd18728a4] 2026-04-07 00:02:37.495250 | orchestrator | openstack_networking_router_v2.router: Creation complete after 3s [id=8b4f026f-56f3-4e2b-a3f2-f1505074c7a2] 2026-04-07 00:02:37.509834 | orchestrator | 
openstack_networking_secgroup_v2.security_group_management: Creating... 2026-04-07 00:02:37.510229 | orchestrator | openstack_networking_router_interface_v2.router_interface: Creating... 2026-04-07 00:02:37.511607 | orchestrator | openstack_networking_secgroup_v2.security_group_node: Creating... 2026-04-07 00:02:37.747161 | orchestrator | openstack_networking_secgroup_v2.security_group_management: Creation complete after 0s [id=75207632-4530-40f3-af9e-39d1fa0139d1] 2026-04-07 00:02:37.757263 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creating... 2026-04-07 00:02:37.757374 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creating... 2026-04-07 00:02:37.758520 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creating... 2026-04-07 00:02:37.758714 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creating... 2026-04-07 00:02:37.764924 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creating... 2026-04-07 00:02:37.765310 | orchestrator | openstack_networking_port_v2.manager_port_management: Creating... 2026-04-07 00:02:37.821347 | orchestrator | openstack_networking_secgroup_v2.security_group_node: Creation complete after 0s [id=55b2e72a-2e1f-4f22-9333-d54514eb6e6b] 2026-04-07 00:02:37.826300 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creating... 2026-04-07 00:02:37.826543 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creating... 2026-04-07 00:02:37.829260 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creating... 
2026-04-07 00:02:38.104732 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creation complete after 0s [id=5edbe8f2-7a65-4fae-a247-fc24bfa60b99] 2026-04-07 00:02:38.124779 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creation complete after 0s [id=ada20a53-1c47-40c4-85a3-177dc89ec48d] 2026-04-07 00:02:38.125188 | orchestrator | openstack_networking_port_v2.node_port_management[5]: Creating... 2026-04-07 00:02:38.133258 | orchestrator | openstack_networking_port_v2.node_port_management[3]: Creating... 2026-04-07 00:02:38.492225 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creation complete after 0s [id=497d2bf5-e8bd-47b5-b335-39e360084e09] 2026-04-07 00:02:38.497347 | orchestrator | openstack_networking_port_v2.node_port_management[1]: Creating... 2026-04-07 00:02:39.154219 | orchestrator | openstack_networking_port_v2.manager_port_management: Creation complete after 1s [id=dc25f09a-0883-4c56-a9f1-b26f54744cb4] 2026-04-07 00:02:39.160785 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creating... 2026-04-07 00:02:39.240013 | orchestrator | openstack_networking_port_v2.node_port_management[3]: Creation complete after 1s [id=3ae34eab-344b-47c8-80d0-5f1ffb9db47a] 2026-04-07 00:02:39.253614 | orchestrator | openstack_networking_port_v2.node_port_management[2]: Creating... 2026-04-07 00:02:39.316242 | orchestrator | openstack_networking_port_v2.node_port_management[5]: Creation complete after 1s [id=4500d28f-617f-4d9a-8577-be05dcedd7a7] 2026-04-07 00:02:39.333020 | orchestrator | openstack_networking_port_v2.node_port_management[0]: Creating... 
2026-04-07 00:02:39.370794 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creation complete after 1s [id=eb15e8cc-d1fe-423e-abe3-babef2ddd0cf] 2026-04-07 00:02:39.389889 | orchestrator | openstack_networking_port_v2.node_port_management[4]: Creating... 2026-04-07 00:02:39.527803 | orchestrator | openstack_networking_port_v2.node_port_management[1]: Creation complete after 2s [id=c0c244a0-fd85-4fc9-8f48-f2c9c27d00c3] 2026-04-07 00:02:39.547160 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creation complete after 2s [id=8a9de135-43e1-4cc2-9881-432688965994] 2026-04-07 00:02:39.853557 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creation complete after 1s [id=c23c06e1-404f-4c4c-bb0e-e4b08a5dce9b] 2026-04-07 00:02:40.396625 | orchestrator | openstack_networking_port_v2.node_port_management[2]: Creation complete after 1s [id=c77eb69f-2367-487d-be5d-bac634e56067] 2026-04-07 00:02:40.409665 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creation complete after 2s [id=f9aee435-f4b4-4d64-b5f8-bd6505be3360] 2026-04-07 00:02:40.776886 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creation complete after 3s [id=5eff4c87-5da7-45e8-a67c-51a90dffea9a] 2026-04-07 00:02:41.069366 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creation complete after 3s [id=2d05c24f-c37d-4e64-83ad-57473e2a5d0b] 2026-04-07 00:02:41.159612 | orchestrator | openstack_networking_router_interface_v2.router_interface: Creation complete after 3s [id=a5258c74-8aea-4bc5-8655-25b6f2d9ca16] 2026-04-07 00:02:41.166244 | orchestrator | openstack_networking_floatingip_v2.manager_floating_ip: Creating... 
2026-04-07 00:02:41.554290 | orchestrator | openstack_networking_port_v2.node_port_management[4]: Creation complete after 3s [id=1ec611f6-3585-4cf4-a202-9aa82078f808] 2026-04-07 00:02:42.097447 | orchestrator | openstack_networking_port_v2.node_port_management[0]: Creation complete after 3s [id=2efc7932-33c1-4954-96fc-f43d4a4cf0cd] 2026-04-07 00:02:42.136139 | orchestrator | openstack_compute_instance_v2.node_server[5]: Creating... 2026-04-07 00:02:42.144074 | orchestrator | openstack_compute_instance_v2.node_server[1]: Creating... 2026-04-07 00:02:42.147778 | orchestrator | openstack_compute_instance_v2.node_server[2]: Creating... 2026-04-07 00:02:42.149847 | orchestrator | openstack_compute_instance_v2.node_server[0]: Creating... 2026-04-07 00:02:42.158120 | orchestrator | openstack_compute_instance_v2.node_server[4]: Creating... 2026-04-07 00:02:42.158174 | orchestrator | openstack_compute_instance_v2.node_server[3]: Creating... 2026-04-07 00:02:42.587399 | orchestrator | openstack_networking_floatingip_v2.manager_floating_ip: Creation complete after 2s [id=626f8b5c-3a6f-4237-9f27-75c7467a8c84] 2026-04-07 00:02:42.599784 | orchestrator | openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creating... 2026-04-07 00:02:42.602667 | orchestrator | local_file.MANAGER_ADDRESS: Creating... 2026-04-07 00:02:42.605442 | orchestrator | local_file.inventory: Creating... 2026-04-07 00:02:42.610527 | orchestrator | local_file.inventory: Creation complete after 0s [id=9d72629d4112ca9f1ef4d32b9599eab7c195ba00] 2026-04-07 00:02:42.612111 | orchestrator | local_file.MANAGER_ADDRESS: Creation complete after 0s [id=f1200c2153aadd089b4b8c815d4d038f43c0b19b] 2026-04-07 00:02:44.716065 | orchestrator | openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creation complete after 2s [id=626f8b5c-3a6f-4237-9f27-75c7467a8c84] 2026-04-07 00:02:52.140556 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... 
[10s elapsed] 2026-04-07 00:02:52.143797 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [10s elapsed] 2026-04-07 00:02:52.148970 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [10s elapsed] 2026-04-07 00:02:52.159238 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [10s elapsed] 2026-04-07 00:02:52.160356 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [10s elapsed] 2026-04-07 00:02:52.161525 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [10s elapsed] 2026-04-07 00:03:02.148647 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [20s elapsed] 2026-04-07 00:03:02.148746 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [20s elapsed] 2026-04-07 00:03:02.149802 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [20s elapsed] 2026-04-07 00:03:02.160131 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [20s elapsed] 2026-04-07 00:03:02.161431 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [20s elapsed] 2026-04-07 00:03:02.162729 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [20s elapsed] 2026-04-07 00:03:12.156720 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [30s elapsed] 2026-04-07 00:03:12.156834 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [30s elapsed] 2026-04-07 00:03:12.156863 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [30s elapsed] 2026-04-07 00:03:12.160932 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [30s elapsed] 2026-04-07 00:03:12.162247 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... 
[30s elapsed] 2026-04-07 00:03:12.163456 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [30s elapsed] 2026-04-07 00:03:13.264091 | orchestrator | openstack_compute_instance_v2.node_server[2]: Creation complete after 31s [id=8a9152f8-37a0-418a-8f6c-2d84fd242ddd] 2026-04-07 00:03:22.164667 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [40s elapsed] 2026-04-07 00:03:22.164804 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [40s elapsed] 2026-04-07 00:03:22.164954 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [40s elapsed] 2026-04-07 00:03:22.165074 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [40s elapsed] 2026-04-07 00:03:22.165189 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [40s elapsed] 2026-04-07 00:03:22.983921 | orchestrator | openstack_compute_instance_v2.node_server[0]: Creation complete after 41s [id=c48b8c26-378b-4862-baba-d27a021d92de] 2026-04-07 00:03:23.159009 | orchestrator | openstack_compute_instance_v2.node_server[4]: Creation complete after 41s [id=9d78e07e-81ae-4395-a7cc-08061fb4c084] 2026-04-07 00:03:32.173326 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [50s elapsed] 2026-04-07 00:03:32.173484 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [50s elapsed] 2026-04-07 00:03:32.173516 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... 
[50s elapsed] 2026-04-07 00:03:33.006795 | orchestrator | openstack_compute_instance_v2.node_server[5]: Creation complete after 51s [id=4dffb511-7fd7-4209-8590-61b5c594d7c1] 2026-04-07 00:03:33.147749 | orchestrator | openstack_compute_instance_v2.node_server[3]: Creation complete after 51s [id=f5baa031-c8a1-41be-8bc5-e62a96e08dc8] 2026-04-07 00:03:34.137178 | orchestrator | openstack_compute_instance_v2.node_server[1]: Creation complete after 52s [id=4a88dc0f-dad5-4a40-af93-938e8fb29216] 2026-04-07 00:03:34.159818 | orchestrator | null_resource.node_semaphore: Creating... 2026-04-07 00:03:34.164111 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creating... 2026-04-07 00:03:34.170350 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creating... 2026-04-07 00:03:34.172914 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creating... 2026-04-07 00:03:34.173911 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creating... 2026-04-07 00:03:34.174296 | orchestrator | null_resource.node_semaphore: Creation complete after 0s [id=9208270725685890079] 2026-04-07 00:03:34.174960 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creating... 2026-04-07 00:03:34.182202 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creating... 2026-04-07 00:03:34.199076 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creating... 2026-04-07 00:03:34.201535 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creating... 2026-04-07 00:03:34.203069 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creating... 2026-04-07 00:03:34.216993 | orchestrator | openstack_compute_instance_v2.manager_server: Creating... 
2026-04-07 00:03:37.882363 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creation complete after 4s [id=f5baa031-c8a1-41be-8bc5-e62a96e08dc8/d98a6229-64c7-4f26-837e-eda0f824cf1d] 2026-04-07 00:03:37.892395 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creation complete after 4s [id=9d78e07e-81ae-4395-a7cc-08061fb4c084/1469229d-4b75-4251-a9b8-5b75cda4a696] 2026-04-07 00:03:37.921221 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creation complete after 4s [id=4dffb511-7fd7-4209-8590-61b5c594d7c1/e06458de-fcc8-49b9-b479-fcb02169b5c8] 2026-04-07 00:03:38.012242 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creation complete after 4s [id=4dffb511-7fd7-4209-8590-61b5c594d7c1/61826d0c-ccdc-4393-b392-5dc26cd19349] 2026-04-07 00:03:44.012522 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creation complete after 10s [id=9d78e07e-81ae-4395-a7cc-08061fb4c084/18dce6fc-4f14-415a-9461-5b764394eff6] 2026-04-07 00:03:44.018618 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creation complete after 10s [id=f5baa031-c8a1-41be-8bc5-e62a96e08dc8/ee2515b7-1de0-4cb8-a492-67bb0415ec88] 2026-04-07 00:03:44.117965 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creation complete after 10s [id=f5baa031-c8a1-41be-8bc5-e62a96e08dc8/0aceb24c-1141-4b89-81c4-2bd069400a76] 2026-04-07 00:03:44.133245 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creation complete after 10s [id=9d78e07e-81ae-4395-a7cc-08061fb4c084/967b79e7-41ef-439c-974d-46e00c7544ba] 2026-04-07 00:03:44.175245 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creation complete after 10s [id=4dffb511-7fd7-4209-8590-61b5c594d7c1/d9b6b982-5d2c-47ad-95ce-6e4d358a27cd] 2026-04-07 00:03:44.222868 | orchestrator | 
openstack_compute_instance_v2.manager_server: Still creating... [10s elapsed] 2026-04-07 00:03:54.223578 | orchestrator | openstack_compute_instance_v2.manager_server: Still creating... [20s elapsed] 2026-04-07 00:03:54.610663 | orchestrator | openstack_compute_instance_v2.manager_server: Creation complete after 21s [id=a4ca6094-ec78-4e49-906c-c82e3f1dbd2b] 2026-04-07 00:03:54.634312 | orchestrator | 2026-04-07 00:03:54.634391 | orchestrator | Apply complete! Resources: 64 added, 0 changed, 0 destroyed. 2026-04-07 00:03:54.634411 | orchestrator | 2026-04-07 00:03:54.634419 | orchestrator | Outputs: 2026-04-07 00:03:54.634426 | orchestrator | 2026-04-07 00:03:54.634433 | orchestrator | manager_address = 2026-04-07 00:03:54.634441 | orchestrator | private_key = 2026-04-07 00:03:54.709446 | orchestrator | ok: Runtime: 0:01:32.254274 2026-04-07 00:03:54.730427 | 2026-04-07 00:03:54.730570 | TASK [Fetch manager address] 2026-04-07 00:03:55.231228 | orchestrator | ok 2026-04-07 00:03:55.239018 | 2026-04-07 00:03:55.239132 | TASK [Set manager_host address] 2026-04-07 00:03:55.313911 | orchestrator | ok 2026-04-07 00:03:55.322351 | 2026-04-07 00:03:55.322513 | LOOP [Update ansible collections] 2026-04-07 00:03:56.549277 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2026-04-07 00:03:56.549565 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 2026-04-07 00:03:56.549620 | orchestrator | Starting galaxy collection install process 2026-04-07 00:03:56.549646 | orchestrator | Process install dependency map 2026-04-07 00:03:56.549668 | orchestrator | Starting collection install process 2026-04-07 00:03:56.549734 | orchestrator | Installing 'osism.commons:999.0.0' to '/home/zuul-testbed02/.ansible/collections/ansible_collections/osism/commons' 2026-04-07 00:03:56.549775 | orchestrator | Created collection for osism.commons:999.0.0 at 
/home/zuul-testbed02/.ansible/collections/ansible_collections/osism/commons
2026-04-07 00:03:56.549842 | orchestrator | osism.commons:999.0.0 was installed successfully
2026-04-07 00:03:56.549922 | orchestrator | ok: Item: commons Runtime: 0:00:00.833686
2026-04-07 00:03:57.856255 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2
2026-04-07 00:03:57.856481 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2
2026-04-07 00:03:57.856539 | orchestrator | Starting galaxy collection install process
2026-04-07 00:03:57.856579 | orchestrator | Process install dependency map
2026-04-07 00:03:57.856616 | orchestrator | Starting collection install process
2026-04-07 00:03:57.856651 | orchestrator | Installing 'osism.services:999.0.0' to '/home/zuul-testbed02/.ansible/collections/ansible_collections/osism/services'
2026-04-07 00:03:57.856767 | orchestrator | Created collection for osism.services:999.0.0 at /home/zuul-testbed02/.ansible/collections/ansible_collections/osism/services
2026-04-07 00:03:57.856806 | orchestrator | osism.services:999.0.0 was installed successfully
2026-04-07 00:03:57.856858 | orchestrator | ok: Item: services Runtime: 0:00:00.955684
2026-04-07 00:03:57.883208 |
2026-04-07 00:03:57.883985 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"]
2026-04-07 00:04:08.456188 | orchestrator | ok
2026-04-07 00:04:08.465061 |
2026-04-07 00:04:08.465176 | TASK [Wait a little longer for the manager so that everything is ready]
2026-04-07 00:05:08.509574 | orchestrator | ok
2026-04-07 00:05:08.519734 |
2026-04-07 00:05:08.519875 | TASK [Fetch manager ssh hostkey]
2026-04-07 00:05:10.100166 | orchestrator | Output suppressed because no_log was given
2026-04-07 00:05:10.114564 |
2026-04-07 00:05:10.114740 | TASK [Get ssh keypair from terraform environment]
2026-04-07 00:05:10.655293 | orchestrator | ok: Runtime: 0:00:00.008519
2026-04-07 00:05:10.672586 |
2026-04-07 00:05:10.672756 | TASK [Point out that the following task takes some time and does not give any output]
2026-04-07 00:05:10.723456 | orchestrator | ok: The task 'Run manager part 0' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete.
2026-04-07 00:05:10.734877 |
2026-04-07 00:05:10.735040 | TASK [Run manager part 0]
2026-04-07 00:05:11.843930 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2
2026-04-07 00:05:11.907834 | orchestrator |
2026-04-07 00:05:11.907896 | orchestrator | PLAY [Wait for cloud-init to finish] *******************************************
2026-04-07 00:05:11.907909 | orchestrator |
2026-04-07 00:05:11.907931 | orchestrator | TASK [Check /var/lib/cloud/instance/boot-finished] *****************************
2026-04-07 00:05:13.692457 | orchestrator | ok: [testbed-manager]
2026-04-07 00:05:13.692491 | orchestrator |
2026-04-07 00:05:13.692512 | orchestrator | PLAY [Run manager part 0] ******************************************************
2026-04-07 00:05:13.692522 | orchestrator |
2026-04-07 00:05:13.692530 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-07 00:05:15.572691 | orchestrator | ok: [testbed-manager]
2026-04-07 00:05:15.572741 | orchestrator |
2026-04-07 00:05:15.572751 | orchestrator | TASK [Get home directory of ansible user] **************************************
2026-04-07 00:05:16.300266 | orchestrator | ok: [testbed-manager]
2026-04-07 00:05:16.300312 | orchestrator |
2026-04-07 00:05:16.300320 | orchestrator | TASK [Set repo_path fact] ******************************************************
2026-04-07 00:05:16.349364 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:05:16.349420 | orchestrator |
2026-04-07 00:05:16.349443 | orchestrator | TASK [Fail if Ubuntu version is lower than 24.04] ******************************
2026-04-07 00:05:16.386233 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:05:16.386279 | orchestrator |
2026-04-07 00:05:16.386287 | orchestrator | TASK [Fail if Debian version is lower than 12] *********************************
2026-04-07 00:05:16.429051 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:05:16.429100 | orchestrator |
2026-04-07 00:05:16.429110 | orchestrator | TASK [Set APT options on manager] **********************************************
2026-04-07 00:05:17.176782 | orchestrator | changed: [testbed-manager]
2026-04-07 00:05:17.176833 | orchestrator |
2026-04-07 00:05:17.176841 | orchestrator | TASK [Update APT cache and run dist-upgrade] ***********************************
2026-04-07 00:08:06.518507 | orchestrator | changed: [testbed-manager]
2026-04-07 00:08:06.518558 | orchestrator |
2026-04-07 00:08:06.518567 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************
2026-04-07 00:09:20.378463 | orchestrator | changed: [testbed-manager]
2026-04-07 00:09:20.378507 | orchestrator |
2026-04-07 00:09:20.378517 | orchestrator | TASK [Install required packages] ***********************************************
2026-04-07 00:09:43.542505 | orchestrator | changed: [testbed-manager]
2026-04-07 00:09:43.542603 | orchestrator |
2026-04-07 00:09:43.542621 | orchestrator | TASK [Remove some python packages] *********************************************
2026-04-07 00:09:52.222112 | orchestrator | changed: [testbed-manager]
2026-04-07 00:09:52.222299 | orchestrator |
2026-04-07 00:09:52.222319 | orchestrator | TASK [Set venv_command fact (Debian)] ******************************************
2026-04-07 00:09:52.266366 | orchestrator | ok: [testbed-manager]
2026-04-07 00:09:52.266454 | orchestrator |
2026-04-07 00:09:52.266472 | orchestrator | TASK [Get current user] ********************************************************
2026-04-07 00:09:53.064537 | orchestrator | ok: [testbed-manager]
2026-04-07 00:09:53.064630 | orchestrator |
2026-04-07 00:09:53.064649 | orchestrator | TASK [Create venv directory] ***************************************************
2026-04-07 00:09:53.798134 | orchestrator | changed: [testbed-manager]
2026-04-07 00:09:53.798174 | orchestrator |
2026-04-07 00:09:53.798183 | orchestrator | TASK [Install netaddr in venv] *************************************************
2026-04-07 00:09:59.664642 | orchestrator | changed: [testbed-manager]
2026-04-07 00:09:59.664750 | orchestrator |
2026-04-07 00:09:59.664768 | orchestrator | TASK [Install ansible-core in venv] ********************************************
2026-04-07 00:10:04.959054 | orchestrator | changed: [testbed-manager]
2026-04-07 00:10:04.959145 | orchestrator |
2026-04-07 00:10:04.959162 | orchestrator | TASK [Install requests >= 2.32.2] **********************************************
2026-04-07 00:10:07.521031 | orchestrator | changed: [testbed-manager]
2026-04-07 00:10:07.521064 | orchestrator |
2026-04-07 00:10:07.521072 | orchestrator | TASK [Install docker >= 7.1.0] *************************************************
2026-04-07 00:10:09.268419 | orchestrator | changed: [testbed-manager]
2026-04-07 00:10:09.268496 | orchestrator |
2026-04-07 00:10:09.268513 | orchestrator | TASK [Create directories in /opt/src] ******************************************
2026-04-07 00:10:10.365380 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons)
2026-04-07 00:10:10.365487 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services)
2026-04-07 00:10:10.365503 | orchestrator |
2026-04-07 00:10:10.365520 | orchestrator | TASK [Sync sources in /opt/src] ************************************************
2026-04-07 00:10:10.405990 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call
2026-04-07 00:10:10.406052 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version
2026-04-07 00:10:10.406060 | orchestrator | 2.19. Deprecation warnings can be disabled by setting
2026-04-07 00:10:10.406066 | orchestrator | deprecation_warnings=False in ansible.cfg.
2026-04-07 00:10:13.681545 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons)
2026-04-07 00:10:13.681608 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services)
2026-04-07 00:10:13.681644 | orchestrator |
2026-04-07 00:10:13.681658 | orchestrator | TASK [Create /usr/share/ansible directory] *************************************
2026-04-07 00:10:14.253911 | orchestrator | changed: [testbed-manager]
2026-04-07 00:10:14.253993 | orchestrator |
2026-04-07 00:10:14.254007 | orchestrator | TASK [Install collections from Ansible galaxy] *********************************
2026-04-07 00:12:36.559411 | orchestrator | changed: [testbed-manager] => (item=ansible.netcommon)
2026-04-07 00:12:36.559514 | orchestrator | changed: [testbed-manager] => (item=ansible.posix)
2026-04-07 00:12:36.559526 | orchestrator | changed: [testbed-manager] => (item=community.docker>=3.10.2)
2026-04-07 00:12:36.559533 | orchestrator |
2026-04-07 00:12:36.559541 | orchestrator | TASK [Install local collections] ***********************************************
2026-04-07 00:12:38.854199 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-commons)
2026-04-07 00:12:38.854291 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-services)
2026-04-07 00:12:38.854306 | orchestrator |
2026-04-07 00:12:38.854320 | orchestrator | PLAY [Create operator user] ****************************************************
2026-04-07 00:12:38.854332 | orchestrator |
2026-04-07 00:12:38.854344 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-07 00:12:40.252301 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:40.252388 | orchestrator |
2026-04-07 00:12:40.252404 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] *****
2026-04-07 00:12:40.297684 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:40.297771 | orchestrator |
2026-04-07 00:12:40.297788 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] ***
2026-04-07 00:12:40.363427 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:40.363569 | orchestrator |
2026-04-07 00:12:40.363588 | orchestrator | TASK [osism.commons.operator : Create operator group] **************************
2026-04-07 00:12:41.149405 | orchestrator | changed: [testbed-manager]
2026-04-07 00:12:41.149718 | orchestrator |
2026-04-07 00:12:41.149746 | orchestrator | TASK [osism.commons.operator : Create user] ************************************
2026-04-07 00:12:41.831385 | orchestrator | changed: [testbed-manager]
2026-04-07 00:12:41.831982 | orchestrator |
2026-04-07 00:12:41.832019 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ******************
2026-04-07 00:12:43.184265 | orchestrator | changed: [testbed-manager] => (item=adm)
2026-04-07 00:12:43.184317 | orchestrator | changed: [testbed-manager] => (item=sudo)
2026-04-07 00:12:43.184324 | orchestrator |
2026-04-07 00:12:43.184331 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] *************************
2026-04-07 00:12:44.587685 | orchestrator | changed: [testbed-manager]
2026-04-07 00:12:44.587755 | orchestrator |
2026-04-07 00:12:44.587770 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] ***
2026-04-07 00:12:46.402313 | orchestrator | changed: [testbed-manager] => (item=export LANGUAGE=C.UTF-8)
2026-04-07 00:12:46.402389 | orchestrator | changed: [testbed-manager] => (item=export LANG=C.UTF-8)
2026-04-07 00:12:46.402421 | orchestrator | changed: [testbed-manager] => (item=export LC_ALL=C.UTF-8)
2026-04-07 00:12:46.402432 | orchestrator |
2026-04-07 00:12:46.402444 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] ***
2026-04-07 00:12:46.458794 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:46.458878 | orchestrator |
2026-04-07 00:12:46.458893 | orchestrator | TASK [osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file] ***
2026-04-07 00:12:46.525107 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:46.525196 | orchestrator |
2026-04-07 00:12:46.525213 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] **************************
2026-04-07 00:12:47.090175 | orchestrator | changed: [testbed-manager]
2026-04-07 00:12:47.090919 | orchestrator |
2026-04-07 00:12:47.090943 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************
2026-04-07 00:12:47.153731 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:47.153789 | orchestrator |
2026-04-07 00:12:47.153798 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************
2026-04-07 00:12:47.992307 | orchestrator | changed: [testbed-manager] => (item=None)
2026-04-07 00:12:47.992546 | orchestrator | changed: [testbed-manager]
2026-04-07 00:12:47.992565 | orchestrator |
2026-04-07 00:12:47.992575 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] *********************
2026-04-07 00:12:48.034620 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:48.034708 | orchestrator |
2026-04-07 00:12:48.034724 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] *****************
2026-04-07 00:12:48.071645 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:48.071740 | orchestrator |
2026-04-07 00:12:48.071757 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] **************
2026-04-07 00:12:48.108271 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:48.108357 | orchestrator |
2026-04-07 00:12:48.108373 | orchestrator | TASK [osism.commons.operator : Set password] ***********************************
2026-04-07 00:12:48.178082 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:48.178198 | orchestrator |
2026-04-07 00:12:48.178223 | orchestrator | TASK [osism.commons.operator : Unset & lock password] **************************
2026-04-07 00:12:48.890176 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:48.890261 | orchestrator |
2026-04-07 00:12:48.890276 | orchestrator | PLAY [Run manager part 0] ******************************************************
2026-04-07 00:12:48.890289 | orchestrator |
2026-04-07 00:12:48.890304 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-07 00:12:50.258093 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:50.258204 | orchestrator |
2026-04-07 00:12:50.258229 | orchestrator | TASK [Recursively change ownership of /opt/venv] *******************************
2026-04-07 00:12:51.215504 | orchestrator | changed: [testbed-manager]
2026-04-07 00:12:51.215565 | orchestrator |
2026-04-07 00:12:51.215575 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:12:51.215583 | orchestrator | testbed-manager : ok=33 changed=23 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0
2026-04-07 00:12:51.215590 | orchestrator |
2026-04-07 00:12:51.569173 | orchestrator | ok: Runtime: 0:07:40.220103
2026-04-07 00:12:51.592578 |
2026-04-07 00:12:51.592792 | TASK [Point out that the log in on the manager is now possible]
2026-04-07 00:12:51.634210 | orchestrator | ok: It is now already possible to log in to the manager with 'make login'.
2026-04-07 00:12:51.647258 |
2026-04-07 00:12:51.647434 | TASK [Point out that the following task takes some time and does not give any output]
2026-04-07 00:12:51.687275 | orchestrator | ok: The task 'Run manager part 1 + 2' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete.
2026-04-07 00:12:51.697184 |
2026-04-07 00:12:51.697317 | TASK [Run manager part 1 + 2]
2026-04-07 00:12:52.612767 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2
2026-04-07 00:12:52.667071 | orchestrator |
2026-04-07 00:12:52.667166 | orchestrator | PLAY [Run manager part 1] ******************************************************
2026-04-07 00:12:52.667187 | orchestrator |
2026-04-07 00:12:52.667228 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-07 00:12:55.580514 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:55.580722 | orchestrator |
2026-04-07 00:12:55.580789 | orchestrator | TASK [Set venv_command fact (RedHat)] ******************************************
2026-04-07 00:12:55.616872 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:55.616962 | orchestrator |
2026-04-07 00:12:55.616998 | orchestrator | TASK [Set venv_command fact (Debian)] ******************************************
2026-04-07 00:12:55.672229 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:55.672321 | orchestrator |
2026-04-07 00:12:55.672343 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] ***
2026-04-07 00:12:55.710188 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:55.710290 | orchestrator |
2026-04-07 00:12:55.710316 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] ***
2026-04-07 00:12:55.778380 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:55.778494 | orchestrator |
2026-04-07 00:12:55.778513 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ******************
2026-04-07 00:12:55.837270 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:55.837357 | orchestrator |
2026-04-07 00:12:55.837376 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] ***
2026-04-07 00:12:55.882007 | orchestrator | included: /home/zuul-testbed02/.ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager
2026-04-07 00:12:55.882135 | orchestrator |
2026-04-07 00:12:55.882151 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] *****
2026-04-07 00:12:56.586964 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:56.587055 | orchestrator |
2026-04-07 00:12:56.587081 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] *************
2026-04-07 00:12:56.640838 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:12:56.640930 | orchestrator |
2026-04-07 00:12:56.640946 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] ***************
2026-04-07 00:12:58.005683 | orchestrator | changed: [testbed-manager]
2026-04-07 00:12:58.005782 | orchestrator |
2026-04-07 00:12:58.005803 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] *********************
2026-04-07 00:12:58.561139 | orchestrator | ok: [testbed-manager]
2026-04-07 00:12:58.561229 | orchestrator |
2026-04-07 00:12:58.561246 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] *********************
2026-04-07 00:12:59.729245 | orchestrator | changed: [testbed-manager]
2026-04-07 00:12:59.729317 | orchestrator |
2026-04-07 00:12:59.729331 | orchestrator | TASK [osism.commons.repository : Update package cache] *************************
2026-04-07 00:13:14.070357 | orchestrator | changed: [testbed-manager]
2026-04-07 00:13:14.070422 | orchestrator |
2026-04-07 00:13:14.070447 | orchestrator | TASK [Get home directory of ansible user] **************************************
2026-04-07 00:13:14.758835 | orchestrator | ok: [testbed-manager]
2026-04-07 00:13:14.758902 | orchestrator |
2026-04-07 00:13:14.758914 | orchestrator | TASK [Set repo_path fact] ******************************************************
2026-04-07 00:13:14.810852 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:13:14.810937 | orchestrator |
2026-04-07 00:13:14.810956 | orchestrator | TASK [Copy SSH public key] *****************************************************
2026-04-07 00:13:15.689274 | orchestrator | changed: [testbed-manager]
2026-04-07 00:13:15.689305 | orchestrator |
2026-04-07 00:13:15.689401 | orchestrator | TASK [Copy SSH private key] ****************************************************
2026-04-07 00:13:16.557365 | orchestrator | changed: [testbed-manager]
2026-04-07 00:13:16.557401 | orchestrator |
2026-04-07 00:13:16.557407 | orchestrator | TASK [Create configuration directory] ******************************************
2026-04-07 00:13:17.047385 | orchestrator | changed: [testbed-manager]
2026-04-07 00:13:17.047419 | orchestrator |
2026-04-07 00:13:17.047468 | orchestrator | TASK [Copy testbed repo] *******************************************************
2026-04-07 00:13:17.088261 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call
2026-04-07 00:13:17.088394 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version
2026-04-07 00:13:17.088420 | orchestrator | 2.19. Deprecation warnings can be disabled by setting
2026-04-07 00:13:17.088474 | orchestrator | deprecation_warnings=False in ansible.cfg.
2026-04-07 00:13:19.168676 | orchestrator | changed: [testbed-manager]
2026-04-07 00:13:19.168741 | orchestrator |
2026-04-07 00:13:19.168748 | orchestrator | TASK [Install python requirements in venv] *************************************
2026-04-07 00:13:27.559312 | orchestrator | ok: [testbed-manager] => (item=Jinja2)
2026-04-07 00:13:27.559355 | orchestrator | ok: [testbed-manager] => (item=PyYAML)
2026-04-07 00:13:27.559363 | orchestrator | ok: [testbed-manager] => (item=packaging)
2026-04-07 00:13:27.559369 | orchestrator | changed: [testbed-manager] => (item=python-gilt==1.2.3)
2026-04-07 00:13:27.559379 | orchestrator | ok: [testbed-manager] => (item=requests>=2.32.2)
2026-04-07 00:13:27.559385 | orchestrator | ok: [testbed-manager] => (item=docker>=7.1.0)
2026-04-07 00:13:27.559390 | orchestrator |
2026-04-07 00:13:27.559397 | orchestrator | TASK [Copy testbed custom CA certificate on Debian/Ubuntu] *********************
2026-04-07 00:13:28.533903 | orchestrator | changed: [testbed-manager]
2026-04-07 00:13:28.534010 | orchestrator |
2026-04-07 00:13:28.534101 | orchestrator | TASK [Run update-ca-certificates on Debian/Ubuntu] *****************************
2026-04-07 00:13:31.378110 | orchestrator | changed: [testbed-manager]
2026-04-07 00:13:31.378199 | orchestrator |
2026-04-07 00:13:31.378215 | orchestrator | TASK [Run update-ca-trust on RedHat] *******************************************
2026-04-07 00:13:31.417023 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:13:31.417123 | orchestrator |
2026-04-07 00:13:31.417142 | orchestrator | TASK [Run manager part 2] ******************************************************
2026-04-07 00:15:08.026164 | orchestrator | changed: [testbed-manager]
2026-04-07 00:15:08.026243 | orchestrator |
2026-04-07 00:15:08.026258 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] *****
2026-04-07 00:15:09.156946 | orchestrator | ok: [testbed-manager]
2026-04-07 00:15:09.157019 | orchestrator |
2026-04-07 00:15:09.157037 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:15:09.157050 | orchestrator | testbed-manager : ok=21 changed=11 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0
2026-04-07 00:15:09.157062 | orchestrator |
2026-04-07 00:15:09.335956 | orchestrator | ok: Runtime: 0:02:17.216701
2026-04-07 00:15:09.352778 |
2026-04-07 00:15:09.352911 | TASK [Reboot manager]
2026-04-07 00:15:10.886717 | orchestrator | ok: Runtime: 0:00:00.946281
2026-04-07 00:15:10.903098 |
2026-04-07 00:15:10.903252 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"]
2026-04-07 00:15:27.343235 | orchestrator | ok
2026-04-07 00:15:27.354676 |
2026-04-07 00:15:27.354819 | TASK [Wait a little longer for the manager so that everything is ready]
2026-04-07 00:16:27.398314 | orchestrator | ok
2026-04-07 00:16:27.407535 |
2026-04-07 00:16:27.407648 | TASK [Deploy manager + bootstrap nodes]
2026-04-07 00:16:29.840042 | orchestrator |
2026-04-07 00:16:29.840252 | orchestrator | # DEPLOY MANAGER
2026-04-07 00:16:29.840318 | orchestrator |
2026-04-07 00:16:29.840335 | orchestrator | + set -e
2026-04-07 00:16:29.840349 | orchestrator | + echo
2026-04-07 00:16:29.840363 | orchestrator | + echo '# DEPLOY MANAGER'
2026-04-07 00:16:29.840386 | orchestrator | + echo
2026-04-07 00:16:29.840451 | orchestrator | + cat /opt/manager-vars.sh
2026-04-07 00:16:29.842652 | orchestrator | export NUMBER_OF_NODES=6
2026-04-07 00:16:29.842689 | orchestrator |
2026-04-07 00:16:29.842702 | orchestrator | export CEPH_VERSION=
2026-04-07 00:16:29.842715 | orchestrator | export CONFIGURATION_VERSION=main
2026-04-07 00:16:29.842727 | orchestrator | export MANAGER_VERSION=10.0.0
2026-04-07 00:16:29.842739 | orchestrator | export OPENSTACK_VERSION=
2026-04-07 00:16:29.842750 | orchestrator |
2026-04-07 00:16:29.842761 | orchestrator | export ARA=false
2026-04-07 00:16:29.842778 | orchestrator | export DEPLOY_MODE=manager
2026-04-07 00:16:29.842790 | orchestrator | export TEMPEST=true
2026-04-07 00:16:29.842801 | orchestrator | export IS_ZUUL=true
2026-04-07 00:16:29.842819 | orchestrator |
2026-04-07 00:16:29.842836 | orchestrator | export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.120
2026-04-07 00:16:29.842848 | orchestrator | export EXTERNAL_API=false
2026-04-07 00:16:29.842859 | orchestrator |
2026-04-07 00:16:29.842876 | orchestrator | export IMAGE_USER=ubuntu
2026-04-07 00:16:29.842887 | orchestrator | export IMAGE_NODE_USER=ubuntu
2026-04-07 00:16:29.842898 | orchestrator |
2026-04-07 00:16:29.842912 | orchestrator | export CEPH_STACK=ceph-ansible
2026-04-07 00:16:29.842930 | orchestrator |
2026-04-07 00:16:29.842942 | orchestrator | + echo
2026-04-07 00:16:29.842954 | orchestrator | + source /opt/configuration/scripts/include.sh
2026-04-07 00:16:29.843860 | orchestrator | ++ export INTERACTIVE=false
2026-04-07 00:16:29.843881 | orchestrator | ++ INTERACTIVE=false
2026-04-07 00:16:29.843895 | orchestrator | ++ export OSISM_APPLY_RETRY=1
2026-04-07 00:16:29.843908 | orchestrator | ++ OSISM_APPLY_RETRY=1
2026-04-07 00:16:29.843925 | orchestrator | + source /opt/manager-vars.sh
2026-04-07 00:16:29.843939 | orchestrator | ++ export NUMBER_OF_NODES=6
2026-04-07 00:16:29.843951 | orchestrator | ++ NUMBER_OF_NODES=6
2026-04-07 00:16:29.844036 | orchestrator | ++ export CEPH_VERSION=
2026-04-07 00:16:29.844051 | orchestrator | ++ CEPH_VERSION=
2026-04-07 00:16:29.844063 | orchestrator | ++ export CONFIGURATION_VERSION=main
2026-04-07 00:16:29.844074 | orchestrator | ++ CONFIGURATION_VERSION=main
2026-04-07 00:16:29.844085 | orchestrator | ++ export MANAGER_VERSION=10.0.0
2026-04-07 00:16:29.844096 | orchestrator | ++ MANAGER_VERSION=10.0.0
2026-04-07 00:16:29.844107 | orchestrator | ++ export OPENSTACK_VERSION=
2026-04-07 00:16:29.844124 | orchestrator | ++ OPENSTACK_VERSION=
2026-04-07 00:16:29.844139 | orchestrator | ++ export ARA=false
2026-04-07 00:16:29.844151 | orchestrator | ++ ARA=false
2026-04-07 00:16:29.844161 | orchestrator | ++ export DEPLOY_MODE=manager
2026-04-07 00:16:29.844181 | orchestrator | ++ DEPLOY_MODE=manager
2026-04-07 00:16:29.844193 | orchestrator | ++ export TEMPEST=true
2026-04-07 00:16:29.844203 | orchestrator | ++ TEMPEST=true
2026-04-07 00:16:29.844221 | orchestrator | ++ export IS_ZUUL=true
2026-04-07 00:16:29.844232 | orchestrator | ++ IS_ZUUL=true
2026-04-07 00:16:29.844243 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.120
2026-04-07 00:16:29.844254 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.120
2026-04-07 00:16:29.844265 | orchestrator | ++ export EXTERNAL_API=false
2026-04-07 00:16:29.844301 | orchestrator | ++ EXTERNAL_API=false
2026-04-07 00:16:29.844312 | orchestrator | ++ export IMAGE_USER=ubuntu
2026-04-07 00:16:29.844327 | orchestrator | ++ IMAGE_USER=ubuntu
2026-04-07 00:16:29.844339 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu
2026-04-07 00:16:29.844350 | orchestrator | ++ IMAGE_NODE_USER=ubuntu
2026-04-07 00:16:29.844361 | orchestrator | ++ export CEPH_STACK=ceph-ansible
2026-04-07 00:16:29.844372 | orchestrator | ++ CEPH_STACK=ceph-ansible
2026-04-07 00:16:29.844383 | orchestrator | + sudo ln -sf /opt/configuration/contrib/semver2.sh /usr/local/bin/semver
2026-04-07 00:16:29.895533 | orchestrator | + docker version
2026-04-07 00:16:30.005006 | orchestrator | Client: Docker Engine - Community
2026-04-07 00:16:30.005106 | orchestrator | Version: 27.5.1
2026-04-07 00:16:30.005120 | orchestrator | API version: 1.47
2026-04-07 00:16:30.005131 | orchestrator | Go version: go1.22.11
2026-04-07 00:16:30.005142 | orchestrator | Git commit: 9f9e405
2026-04-07 00:16:30.005153 | orchestrator | Built: Wed Jan 22 13:41:48 2025
2026-04-07 00:16:30.005165 | orchestrator | OS/Arch: linux/amd64
2026-04-07 00:16:30.005176 | orchestrator | Context: default
2026-04-07 00:16:30.005187 | orchestrator |
2026-04-07 00:16:30.005198 | orchestrator | Server: Docker Engine - Community
2026-04-07 00:16:30.005209 | orchestrator | Engine:
2026-04-07 00:16:30.005220 | orchestrator | Version: 27.5.1
2026-04-07 00:16:30.005231 | orchestrator | API version: 1.47 (minimum version 1.24)
2026-04-07 00:16:30.005349 | orchestrator | Go version: go1.22.11
2026-04-07 00:16:30.005364 | orchestrator | Git commit: 4c9b3b0
2026-04-07 00:16:30.005375 | orchestrator | Built: Wed Jan 22 13:41:48 2025
2026-04-07 00:16:30.005387 | orchestrator | OS/Arch: linux/amd64
2026-04-07 00:16:30.005397 | orchestrator | Experimental: false
2026-04-07 00:16:30.005409 | orchestrator | containerd:
2026-04-07 00:16:30.005420 | orchestrator | Version: v2.2.2
2026-04-07 00:16:30.005431 | orchestrator | GitCommit: 301b2dac98f15c27117da5c8af12118a041a31d9
2026-04-07 00:16:30.005442 | orchestrator | runc:
2026-04-07 00:16:30.005453 | orchestrator | Version: 1.3.4
2026-04-07 00:16:30.005464 | orchestrator | GitCommit: v1.3.4-0-gd6d73eb8
2026-04-07 00:16:30.005475 | orchestrator | docker-init:
2026-04-07 00:16:30.005486 | orchestrator | Version: 0.19.0
2026-04-07 00:16:30.005498 | orchestrator | GitCommit: de40ad0
2026-04-07 00:16:30.008401 | orchestrator | + sh -c /opt/configuration/scripts/deploy/000-manager.sh
2026-04-07 00:16:30.019211 | orchestrator | + set -e
2026-04-07 00:16:30.019364 | orchestrator | + source /opt/manager-vars.sh
2026-04-07 00:16:30.019390 | orchestrator | ++ export NUMBER_OF_NODES=6
2026-04-07 00:16:30.019410 | orchestrator | ++ NUMBER_OF_NODES=6
2026-04-07 00:16:30.019430 | orchestrator | ++ export CEPH_VERSION=
2026-04-07 00:16:30.019449 | orchestrator | ++ CEPH_VERSION=
2026-04-07 00:16:30.019468 | orchestrator | ++ export CONFIGURATION_VERSION=main
2026-04-07 00:16:30.019490 | orchestrator | ++ CONFIGURATION_VERSION=main
2026-04-07 00:16:30.019511 | orchestrator | ++ export MANAGER_VERSION=10.0.0
2026-04-07 00:16:30.019534 | orchestrator | ++ MANAGER_VERSION=10.0.0
2026-04-07 00:16:30.019546 | orchestrator | ++ export OPENSTACK_VERSION=
2026-04-07 00:16:30.019557 | orchestrator | ++ OPENSTACK_VERSION=
2026-04-07 00:16:30.019568 | orchestrator | ++ export ARA=false
2026-04-07 00:16:30.019589 | orchestrator | ++ ARA=false
2026-04-07 00:16:30.019600 | orchestrator | ++ export DEPLOY_MODE=manager
2026-04-07 00:16:30.019611 | orchestrator | ++ DEPLOY_MODE=manager
2026-04-07 00:16:30.019635 | orchestrator | ++ export TEMPEST=true
2026-04-07 00:16:30.019647 | orchestrator | ++ TEMPEST=true
2026-04-07 00:16:30.019658 | orchestrator | ++ export IS_ZUUL=true
2026-04-07 00:16:30.019668 | orchestrator | ++ IS_ZUUL=true
2026-04-07 00:16:30.019679 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.120
2026-04-07 00:16:30.019690 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.120
2026-04-07 00:16:30.019701 | orchestrator | ++ export EXTERNAL_API=false
2026-04-07 00:16:30.019712 | orchestrator | ++ EXTERNAL_API=false
2026-04-07 00:16:30.019723 | orchestrator | ++ export IMAGE_USER=ubuntu
2026-04-07 00:16:30.019734 | orchestrator | ++ IMAGE_USER=ubuntu
2026-04-07 00:16:30.019745 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu
2026-04-07 00:16:30.019755 | orchestrator | ++ IMAGE_NODE_USER=ubuntu
2026-04-07 00:16:30.019766 | orchestrator | ++ export CEPH_STACK=ceph-ansible
2026-04-07 00:16:30.019777 | orchestrator | ++ CEPH_STACK=ceph-ansible
2026-04-07 00:16:30.019788 | orchestrator | + source /opt/configuration/scripts/include.sh
2026-04-07 00:16:30.019799 | orchestrator | ++ export INTERACTIVE=false
2026-04-07 00:16:30.019809 | orchestrator | ++ INTERACTIVE=false
2026-04-07 00:16:30.019820 | orchestrator | ++ export OSISM_APPLY_RETRY=1
2026-04-07 00:16:30.019835 | orchestrator | ++ OSISM_APPLY_RETRY=1
2026-04-07 00:16:30.019846 | orchestrator | + [[ 10.0.0 != \l\a\t\e\s\t ]]
2026-04-07 00:16:30.019857 | orchestrator | + /opt/configuration/scripts/set-manager-version.sh 10.0.0
2026-04-07 00:16:30.027835 | orchestrator | + set -e
2026-04-07 00:16:30.027907 | orchestrator | + VERSION=10.0.0
2026-04-07 00:16:30.027920 | orchestrator | + sed -i 's/manager_version: .*/manager_version: 10.0.0/g' /opt/configuration/environments/manager/configuration.yml
2026-04-07 00:16:30.036441 | orchestrator | + [[ 10.0.0 != \l\a\t\e\s\t ]]
2026-04-07 00:16:30.036514 | orchestrator | + sed -i /ceph_version:/d /opt/configuration/environments/manager/configuration.yml
2026-04-07 00:16:30.040724 | orchestrator | + sed -i /openstack_version:/d /opt/configuration/environments/manager/configuration.yml
2026-04-07 00:16:30.043840 | orchestrator | + sh -c /opt/configuration/scripts/sync-configuration-repository.sh
2026-04-07 00:16:30.052633 | orchestrator | /opt/configuration ~
2026-04-07 00:16:30.052695 | orchestrator | + set -e
2026-04-07 00:16:30.052706 | orchestrator | + pushd /opt/configuration
2026-04-07 00:16:30.052717 | orchestrator | + [[ -e /opt/venv/bin/activate ]]
2026-04-07 00:16:30.054003 | orchestrator | + source /opt/venv/bin/activate
2026-04-07 00:16:30.055119 | orchestrator | ++ deactivate nondestructive
2026-04-07 00:16:30.055138 | orchestrator | ++ '[' -n '' ']'
2026-04-07 00:16:30.055150 | orchestrator | ++ '[' -n '' ']'
2026-04-07 00:16:30.055167 | orchestrator | ++ hash -r
2026-04-07 00:16:30.055201 | orchestrator | ++ '[' -n '' ']'
2026-04-07 00:16:30.055211 | orchestrator | ++ unset VIRTUAL_ENV
2026-04-07 00:16:30.055221 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT
2026-04-07 00:16:30.055230 | orchestrator | ++ '[' '!' nondestructive = nondestructive ']'
2026-04-07 00:16:30.055246 | orchestrator | ++ '[' linux-gnu = cygwin ']'
2026-04-07 00:16:30.055256 | orchestrator | ++ '[' linux-gnu = msys ']'
2026-04-07 00:16:30.055289 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv
2026-04-07 00:16:30.055299 | orchestrator | ++ VIRTUAL_ENV=/opt/venv
2026-04-07 00:16:30.055312 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
2026-04-07 00:16:30.055322 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
2026-04-07 00:16:30.055332 | orchestrator | ++ export PATH
2026-04-07 00:16:30.055342 | orchestrator | ++ '[' -n '' ']'
2026-04-07 00:16:30.055351 | orchestrator | ++ '[' -z '' ']'
2026-04-07 00:16:30.055361 | orchestrator | ++ _OLD_VIRTUAL_PS1=
2026-04-07 00:16:30.055370 | orchestrator | ++ PS1='(venv) '
2026-04-07 00:16:30.055380 | orchestrator | ++ export PS1
2026-04-07 00:16:30.055390 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) '
2026-04-07 00:16:30.055399 | orchestrator | ++ export VIRTUAL_ENV_PROMPT
2026-04-07 00:16:30.055409 | orchestrator | ++ hash -r
2026-04-07 00:16:30.055419 | orchestrator | + pip3 install --no-cache-dir python-gilt==1.2.3 requests Jinja2 PyYAML packaging
2026-04-07 00:16:30.975665 | orchestrator | Requirement already satisfied: python-gilt==1.2.3 in /opt/venv/lib/python3.12/site-packages (1.2.3)
2026-04-07 00:16:30.976570 | orchestrator | Requirement already satisfied: requests in /opt/venv/lib/python3.12/site-packages (2.33.1)
2026-04-07 00:16:30.977755 | orchestrator | Requirement already satisfied: Jinja2 in /opt/venv/lib/python3.12/site-packages (3.1.6)
2026-04-07 00:16:30.978991 | orchestrator | Requirement already satisfied: PyYAML in /opt/venv/lib/python3.12/site-packages (6.0.3)
2026-04-07 00:16:30.980094 | orchestrator | Requirement already satisfied: packaging in /opt/venv/lib/python3.12/site-packages (26.0)
2026-04-07 00:16:30.989635 | orchestrator | Requirement already satisfied: click in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (8.3.2)
2026-04-07 00:16:30.990946 | orchestrator | Requirement already satisfied: colorama in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (0.4.6)
2026-04-07 00:16:30.992008 | orchestrator | Requirement already satisfied: fasteners in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (0.20)
2026-04-07 00:16:30.993231 | orchestrator | Requirement already satisfied: sh in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (2.2.2)
2026-04-07 00:16:31.026813 | orchestrator | Requirement already satisfied: charset_normalizer<4,>=2 in /opt/venv/lib/python3.12/site-packages (from requests) (3.4.7)
2026-04-07 00:16:31.028102 | orchestrator | Requirement already satisfied: idna<4,>=2.5 in /opt/venv/lib/python3.12/site-packages (from requests) (3.11)
2026-04-07 00:16:31.029933 | orchestrator | Requirement already satisfied: urllib3<3,>=1.26 in /opt/venv/lib/python3.12/site-packages (from requests) (2.6.3)
2026-04-07 00:16:31.031107 | orchestrator | Requirement already satisfied: certifi>=2023.5.7 in /opt/venv/lib/python3.12/site-packages (from requests) (2026.2.25)
2026-04-07 00:16:31.035169 | orchestrator | Requirement already satisfied: MarkupSafe>=2.0 in /opt/venv/lib/python3.12/site-packages (from Jinja2) (3.0.3)
2026-04-07 00:16:31.234117 | orchestrator | ++ which gilt
2026-04-07 00:16:31.237380 | orchestrator | + GILT=/opt/venv/bin/gilt
2026-04-07 00:16:31.237414 | orchestrator | + /opt/venv/bin/gilt overlay
2026-04-07 00:16:31.462907 | orchestrator | osism.cfg-generics:
2026-04-07 00:16:31.590572 | orchestrator | - copied (v0.20260319.0) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/environments/manager/images.yml to /opt/configuration/environments/manager/
2026-04-07 00:16:31.590697 | orchestrator | - copied (v0.20260319.0) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/src/render-images.py to /opt/configuration/environments/manager/
2026-04-07 00:16:31.590741 | orchestrator | - copied (v0.20260319.0) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/src/set-versions.py to /opt/configuration/environments/
2026-04-07 00:16:31.590766 | orchestrator | - running `/opt/configuration/scripts/wrapper-gilt.sh render-images` in /opt/configuration/environments/manager/
2026-04-07 00:16:32.330310 | orchestrator | - running `rm render-images.py` in /opt/configuration/environments/manager/
2026-04-07 00:16:32.337231 | orchestrator | - running `/opt/configuration/scripts/wrapper-gilt.sh set-versions` in /opt/configuration/environments/
2026-04-07 00:16:32.637218 | orchestrator | - running `rm set-versions.py` in /opt/configuration/environments/
2026-04-07 00:16:32.677910 | orchestrator | + [[ -e /opt/venv/bin/activate ]]
2026-04-07 00:16:32.678555 | orchestrator | ~
2026-04-07 00:16:32.678654 | orchestrator | + deactivate
2026-04-07 00:16:32.678673 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']'
2026-04-07 00:16:32.678687 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
2026-04-07 00:16:32.678699 | orchestrator | + export PATH
2026-04-07 00:16:32.678710 | orchestrator | + unset _OLD_VIRTUAL_PATH
2026-04-07 00:16:32.678722 | orchestrator | + '[' -n '' ']'
2026-04-07 00:16:32.678733 | orchestrator | + hash -r
2026-04-07 00:16:32.678744 | orchestrator | + '[' -n '' ']'
2026-04-07 00:16:32.678755 | orchestrator | + unset VIRTUAL_ENV
2026-04-07 00:16:32.678766 | orchestrator | + unset VIRTUAL_ENV_PROMPT
2026-04-07 00:16:32.678777 | orchestrator | + '[' '!'
'' = nondestructive ']' 2026-04-07 00:16:32.678788 | orchestrator | + unset -f deactivate 2026-04-07 00:16:32.678799 | orchestrator | + popd 2026-04-07 00:16:32.679642 | orchestrator | + [[ 10.0.0 == \l\a\t\e\s\t ]] 2026-04-07 00:16:32.679688 | orchestrator | + [[ ceph-ansible == \r\o\o\k ]] 2026-04-07 00:16:32.680216 | orchestrator | ++ semver 10.0.0 7.0.0 2026-04-07 00:16:32.725471 | orchestrator | + [[ 1 -ge 0 ]] 2026-04-07 00:16:32.725565 | orchestrator | + echo 'enable_osism_kubernetes: true' 2026-04-07 00:16:32.726145 | orchestrator | ++ semver 10.0.0 10.0.0-0 2026-04-07 00:16:32.782753 | orchestrator | + [[ 1 -ge 0 ]] 2026-04-07 00:16:32.782847 | orchestrator | + sed -i '/^om_enable_rabbitmq_high_availability:/d' /opt/configuration/environments/kolla/configuration.yml 2026-04-07 00:16:32.788013 | orchestrator | + sed -i '/^om_enable_rabbitmq_quorum_queues:/d' /opt/configuration/environments/kolla/configuration.yml 2026-04-07 00:16:32.792361 | orchestrator | + /opt/configuration/scripts/enable-resource-nodes.sh 2026-04-07 00:16:32.865580 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2026-04-07 00:16:32.865710 | orchestrator | + source /opt/venv/bin/activate 2026-04-07 00:16:32.865725 | orchestrator | ++ deactivate nondestructive 2026-04-07 00:16:32.865737 | orchestrator | ++ '[' -n '' ']' 2026-04-07 00:16:32.865748 | orchestrator | ++ '[' -n '' ']' 2026-04-07 00:16:32.865759 | orchestrator | ++ hash -r 2026-04-07 00:16:32.865782 | orchestrator | ++ '[' -n '' ']' 2026-04-07 00:16:32.865797 | orchestrator | ++ unset VIRTUAL_ENV 2026-04-07 00:16:32.865808 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT 2026-04-07 00:16:32.865819 | orchestrator | ++ '[' '!' 
nondestructive = nondestructive ']' 2026-04-07 00:16:32.865831 | orchestrator | ++ '[' linux-gnu = cygwin ']' 2026-04-07 00:16:32.865842 | orchestrator | ++ '[' linux-gnu = msys ']' 2026-04-07 00:16:32.865853 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv 2026-04-07 00:16:32.865864 | orchestrator | ++ VIRTUAL_ENV=/opt/venv 2026-04-07 00:16:32.865876 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-04-07 00:16:32.865887 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-04-07 00:16:32.865898 | orchestrator | ++ export PATH 2026-04-07 00:16:32.865942 | orchestrator | ++ '[' -n '' ']' 2026-04-07 00:16:32.865955 | orchestrator | ++ '[' -z '' ']' 2026-04-07 00:16:32.865966 | orchestrator | ++ _OLD_VIRTUAL_PS1= 2026-04-07 00:16:32.865977 | orchestrator | ++ PS1='(venv) ' 2026-04-07 00:16:32.865988 | orchestrator | ++ export PS1 2026-04-07 00:16:32.866010 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) ' 2026-04-07 00:16:32.866084 | orchestrator | ++ export VIRTUAL_ENV_PROMPT 2026-04-07 00:16:32.866096 | orchestrator | ++ hash -r 2026-04-07 00:16:32.866107 | orchestrator | + ansible-playbook -i testbed-manager, --vault-password-file /opt/configuration/environments/.vault_pass /opt/configuration/ansible/manager-part-3.yml 2026-04-07 00:16:36.088920 | orchestrator | 2026-04-07 00:16:36.089012 | orchestrator | PLAY [Copy custom facts] ******************************************************* 2026-04-07 00:16:36.089028 | orchestrator | 2026-04-07 00:16:36.089040 | orchestrator | TASK [Create custom facts directory] ******************************************* 2026-04-07 00:16:36.647701 | orchestrator | ok: [testbed-manager] 2026-04-07 00:16:36.647792 | orchestrator | 2026-04-07 00:16:36.647805 | orchestrator | TASK [Copy fact files] ********************************************************* 
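The trace above gates configuration on `semver` comparisons: `semver 10.0.0 7.0.0` prints `1`, the script tests `[[ 1 -ge 0 ]]`, and only then appends `enable_osism_kubernetes: true`. A minimal sketch of that gating pattern, assuming a plain x.y.z comparison helper (the real `semver` helper in the image also orders pre-release suffixes such as `10.0.0-0`, which this sketch deliberately ignores):

```shell
#!/usr/bin/env bash
# Hedged sketch of the `semver A B` helper seen in the trace: print 1 if
# A > B, -1 if A < B, 0 if equal. Only plain x.y.z versions are compared;
# pre-release suffixes (e.g. 10.0.0-0) are stripped, not ordered.
semver() {
  local IFS=. i
  local -a a b
  read -ra a <<< "${1%%-*}"   # split "10.0.0" into (10 0 0)
  read -ra b <<< "${2%%-*}"
  for i in 0 1 2; do
    if (( ${a[i]:-0} > ${b[i]:-0} )); then echo 1; return; fi
    if (( ${a[i]:-0} < ${b[i]:-0} )); then echo -1; return; fi
  done
  echo 0
}

# Gate a feature the way the deploy script does: enable the option once
# the manager version reaches the required minimum.
if [[ "$(semver 10.0.0 7.0.0)" -ge 0 ]]; then
  echo 'enable_osism_kubernetes: true'
fi
```

The `-ge 0` test means "greater than or equal", so the gate opens for both newer and exactly matching versions.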
2026-04-07 00:16:37.629299 | orchestrator | changed: [testbed-manager]
2026-04-07 00:16:37.629406 | orchestrator |
2026-04-07 00:16:37.629423 | orchestrator | PLAY [Before the deployment of the manager] ************************************
2026-04-07 00:16:37.629435 | orchestrator |
2026-04-07 00:16:37.629447 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-07 00:16:39.844663 | orchestrator | ok: [testbed-manager]
2026-04-07 00:16:39.844749 | orchestrator |
2026-04-07 00:16:39.844768 | orchestrator | TASK [Get /opt/manager-vars.sh] ************************************************
2026-04-07 00:16:39.897597 | orchestrator | ok: [testbed-manager]
2026-04-07 00:16:39.897683 | orchestrator |
2026-04-07 00:16:39.897698 | orchestrator | TASK [Add ara_server_mariadb_volume_type parameter] ****************************
2026-04-07 00:16:40.329328 | orchestrator | changed: [testbed-manager]
2026-04-07 00:16:40.329388 | orchestrator |
2026-04-07 00:16:40.329398 | orchestrator | TASK [Add netbox_enable parameter] *********************************************
2026-04-07 00:16:40.372667 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:16:40.372745 | orchestrator |
2026-04-07 00:16:40.372760 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************
2026-04-07 00:16:40.695605 | orchestrator | changed: [testbed-manager]
2026-04-07 00:16:40.695677 | orchestrator |
2026-04-07 00:16:40.695684 | orchestrator | TASK [Check if /etc/OTC_region exist] ******************************************
2026-04-07 00:16:41.026566 | orchestrator | ok: [testbed-manager]
2026-04-07 00:16:41.026656 | orchestrator |
2026-04-07 00:16:41.026671 | orchestrator | TASK [Add nova_compute_virt_type parameter] ************************************
2026-04-07 00:16:41.132002 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:16:41.132090 | orchestrator |
2026-04-07 00:16:41.132105 | orchestrator | PLAY [Apply role traefik] ******************************************************
2026-04-07 00:16:41.132117 | orchestrator |
2026-04-07 00:16:41.132129 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-07 00:16:42.878760 | orchestrator | ok: [testbed-manager]
2026-04-07 00:16:42.878870 | orchestrator |
2026-04-07 00:16:42.878887 | orchestrator | TASK [Apply traefik role] ******************************************************
2026-04-07 00:16:42.967805 | orchestrator | included: osism.services.traefik for testbed-manager
2026-04-07 00:16:42.967884 | orchestrator |
2026-04-07 00:16:42.967902 | orchestrator | TASK [osism.services.traefik : Include config tasks] ***************************
2026-04-07 00:16:43.020422 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/config.yml for testbed-manager
2026-04-07 00:16:43.020535 | orchestrator |
2026-04-07 00:16:43.020562 | orchestrator | TASK [osism.services.traefik : Create required directories] ********************
2026-04-07 00:16:45.309883 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik)
2026-04-07 00:16:45.309971 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/certificates)
2026-04-07 00:16:45.309985 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/configuration)
2026-04-07 00:16:45.309997 | orchestrator |
2026-04-07 00:16:45.310009 | orchestrator | TASK [osism.services.traefik : Copy configuration files] ***********************
2026-04-07 00:16:47.117908 | orchestrator | changed: [testbed-manager] => (item=traefik.yml)
2026-04-07 00:16:47.117976 | orchestrator | changed: [testbed-manager] => (item=traefik.env)
2026-04-07 00:16:47.117983 | orchestrator | changed: [testbed-manager] => (item=certificates.yml)
2026-04-07 00:16:47.117991 | orchestrator |
2026-04-07 00:16:47.117998 | orchestrator | TASK [osism.services.traefik : Copy certificate cert files] ********************
2026-04-07 00:16:47.754943 | orchestrator | changed: [testbed-manager] => (item=None)
2026-04-07 00:16:47.755047 | orchestrator | changed: [testbed-manager]
2026-04-07 00:16:47.755063 | orchestrator |
2026-04-07 00:16:47.755075 | orchestrator | TASK [osism.services.traefik : Copy certificate key files] *********************
2026-04-07 00:16:48.381594 | orchestrator | changed: [testbed-manager] => (item=None)
2026-04-07 00:16:48.381691 | orchestrator | changed: [testbed-manager]
2026-04-07 00:16:48.381707 | orchestrator |
2026-04-07 00:16:48.381719 | orchestrator | TASK [osism.services.traefik : Copy dynamic configuration] *********************
2026-04-07 00:16:48.441140 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:16:48.441229 | orchestrator |
2026-04-07 00:16:48.441244 | orchestrator | TASK [osism.services.traefik : Remove dynamic configuration] *******************
2026-04-07 00:16:48.799306 | orchestrator | ok: [testbed-manager]
2026-04-07 00:16:48.799422 | orchestrator |
2026-04-07 00:16:48.799438 | orchestrator | TASK [osism.services.traefik : Include service tasks] **************************
2026-04-07 00:16:48.868115 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/service.yml for testbed-manager
2026-04-07 00:16:48.868188 | orchestrator |
2026-04-07 00:16:48.868198 | orchestrator | TASK [osism.services.traefik : Create traefik external network] ****************
2026-04-07 00:16:49.920131 | orchestrator | changed: [testbed-manager]
2026-04-07 00:16:49.920236 | orchestrator |
2026-04-07 00:16:49.920252 | orchestrator | TASK [osism.services.traefik : Copy docker-compose.yml file] *******************
2026-04-07 00:16:50.691516 | orchestrator | changed: [testbed-manager]
2026-04-07 00:16:50.691566 | orchestrator |
2026-04-07 00:16:50.691572 | orchestrator | TASK [osism.services.traefik : Manage traefik service] *************************
2026-04-07 00:17:01.521390 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:01.521495 | orchestrator |
2026-04-07 00:17:01.521512 | orchestrator | RUNNING HANDLER [osism.services.traefik : Restart traefik service] *************
2026-04-07 00:17:01.567004 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:17:01.567104 | orchestrator |
2026-04-07 00:17:01.567118 | orchestrator | PLAY [Deploy manager service] **************************************************
2026-04-07 00:17:01.567131 | orchestrator |
2026-04-07 00:17:01.567142 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-07 00:17:03.276509 | orchestrator | ok: [testbed-manager]
2026-04-07 00:17:03.276606 | orchestrator |
2026-04-07 00:17:03.276621 | orchestrator | TASK [Apply manager role] ******************************************************
2026-04-07 00:17:03.392722 | orchestrator | included: osism.services.manager for testbed-manager
2026-04-07 00:17:03.392839 | orchestrator |
2026-04-07 00:17:03.392865 | orchestrator | TASK [osism.services.manager : Include install tasks] **************************
2026-04-07 00:17:03.442758 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/install-Debian-family.yml for testbed-manager
2026-04-07 00:17:03.442843 | orchestrator |
2026-04-07 00:17:03.442858 | orchestrator | TASK [osism.services.manager : Install required packages] **********************
2026-04-07 00:17:05.907926 | orchestrator | ok: [testbed-manager]
2026-04-07 00:17:05.908004 | orchestrator |
2026-04-07 00:17:05.908017 | orchestrator | TASK [osism.services.manager : Gather variables for each operating system] *****
2026-04-07 00:17:05.953890 | orchestrator | ok: [testbed-manager]
2026-04-07 00:17:05.953978 | orchestrator |
2026-04-07 00:17:05.953995 | orchestrator | TASK [osism.services.manager : Include config tasks] ***************************
2026-04-07 00:17:06.067978 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config.yml for testbed-manager
2026-04-07 00:17:06.068056 | orchestrator |
2026-04-07 00:17:06.068086 | orchestrator | TASK [osism.services.manager : Create required directories] ********************
2026-04-07 00:17:08.743598 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible)
2026-04-07 00:17:08.743719 | orchestrator | changed: [testbed-manager] => (item=/opt/archive)
2026-04-07 00:17:08.743750 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/configuration)
2026-04-07 00:17:08.743771 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/data)
2026-04-07 00:17:08.743784 | orchestrator | ok: [testbed-manager] => (item=/opt/manager)
2026-04-07 00:17:08.743795 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/secrets)
2026-04-07 00:17:08.743806 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible/secrets)
2026-04-07 00:17:08.743817 | orchestrator | changed: [testbed-manager] => (item=/opt/state)
2026-04-07 00:17:08.743828 | orchestrator |
2026-04-07 00:17:08.743841 | orchestrator | TASK [osism.services.manager : Copy all environment file] **********************
2026-04-07 00:17:09.311747 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:09.311840 | orchestrator |
2026-04-07 00:17:09.311855 | orchestrator | TASK [osism.services.manager : Copy client environment file] *******************
2026-04-07 00:17:09.895735 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:09.895822 | orchestrator |
2026-04-07 00:17:09.895836 | orchestrator | TASK [osism.services.manager : Include ara config tasks] ***********************
2026-04-07 00:17:09.970770 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ara.yml for testbed-manager
2026-04-07 00:17:09.970889 | orchestrator |
2026-04-07 00:17:09.970908 | orchestrator | TASK [osism.services.manager : Copy ARA environment files] *********************
2026-04-07 00:17:11.075887 | orchestrator | changed: [testbed-manager] => (item=ara)
2026-04-07 00:17:11.075983 | orchestrator | changed: [testbed-manager] => (item=ara-server)
2026-04-07 00:17:11.075998 | orchestrator |
2026-04-07 00:17:11.076011 | orchestrator | TASK [osism.services.manager : Copy MariaDB environment file] ******************
2026-04-07 00:17:11.687652 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:11.687708 | orchestrator |
2026-04-07 00:17:11.687718 | orchestrator | TASK [osism.services.manager : Include vault config tasks] *********************
2026-04-07 00:17:11.730690 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:17:11.730775 | orchestrator |
2026-04-07 00:17:11.730822 | orchestrator | TASK [osism.services.manager : Include frontend config tasks] ******************
2026-04-07 00:17:11.830102 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-frontend.yml for testbed-manager
2026-04-07 00:17:11.830200 | orchestrator |
2026-04-07 00:17:11.830215 | orchestrator | TASK [osism.services.manager : Copy frontend environment file] *****************
2026-04-07 00:17:12.393708 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:12.393769 | orchestrator |
2026-04-07 00:17:12.393775 | orchestrator | TASK [osism.services.manager : Include ansible config tasks] *******************
2026-04-07 00:17:12.441420 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ansible.yml for testbed-manager
2026-04-07 00:17:12.441489 | orchestrator |
2026-04-07 00:17:12.441498 | orchestrator | TASK [osism.services.manager : Copy private ssh keys] **************************
2026-04-07 00:17:13.714012 | orchestrator | changed: [testbed-manager] => (item=None)
2026-04-07 00:17:13.714147 | orchestrator | changed: [testbed-manager] => (item=None)
2026-04-07 00:17:13.714163 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:13.714176 | orchestrator |
2026-04-07 00:17:13.714188 | orchestrator | TASK [osism.services.manager : Copy ansible environment file] ******************
2026-04-07 00:17:14.322880 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:14.322975 | orchestrator |
2026-04-07 00:17:14.322991 | orchestrator | TASK [osism.services.manager : Include netbox config tasks] ********************
2026-04-07 00:17:14.381376 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:17:14.381488 | orchestrator |
2026-04-07 00:17:14.381503 | orchestrator | TASK [osism.services.manager : Include celery config tasks] ********************
2026-04-07 00:17:14.473109 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-celery.yml for testbed-manager
2026-04-07 00:17:14.473204 | orchestrator |
2026-04-07 00:17:14.473219 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_watches] ****************
2026-04-07 00:17:14.968740 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:14.968850 | orchestrator |
2026-04-07 00:17:14.968878 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_instances] **************
2026-04-07 00:17:15.355802 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:15.355864 | orchestrator |
2026-04-07 00:17:15.355870 | orchestrator | TASK [osism.services.manager : Copy celery environment files] ******************
2026-04-07 00:17:16.591893 | orchestrator | changed: [testbed-manager] => (item=conductor)
2026-04-07 00:17:16.591983 | orchestrator | changed: [testbed-manager] => (item=openstack)
2026-04-07 00:17:16.591998 | orchestrator |
2026-04-07 00:17:16.592012 | orchestrator | TASK [osism.services.manager : Copy listener environment file] *****************
2026-04-07 00:17:17.251744 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:17.251839 | orchestrator |
2026-04-07 00:17:17.251854 | orchestrator | TASK [osism.services.manager : Check for conductor.yml] ************************
2026-04-07 00:17:17.652592 | orchestrator | ok: [testbed-manager]
2026-04-07 00:17:17.652685 | orchestrator |
2026-04-07 00:17:17.652702 | orchestrator | TASK [osism.services.manager : Copy conductor configuration file] **************
2026-04-07 00:17:18.043789 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:18.043882 | orchestrator |
2026-04-07 00:17:18.043899 | orchestrator | TASK [osism.services.manager : Copy empty conductor configuration file] ********
2026-04-07 00:17:18.094531 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:17:18.094621 | orchestrator |
2026-04-07 00:17:18.094637 | orchestrator | TASK [osism.services.manager : Include wrapper config tasks] *******************
2026-04-07 00:17:18.171816 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-wrapper.yml for testbed-manager
2026-04-07 00:17:18.171950 | orchestrator |
2026-04-07 00:17:18.171977 | orchestrator | TASK [osism.services.manager : Include wrapper vars file] **********************
2026-04-07 00:17:18.216260 | orchestrator | ok: [testbed-manager]
2026-04-07 00:17:18.216365 | orchestrator |
2026-04-07 00:17:18.216381 | orchestrator | TASK [osism.services.manager : Copy wrapper scripts] ***************************
2026-04-07 00:17:20.285808 | orchestrator | changed: [testbed-manager] => (item=osism)
2026-04-07 00:17:20.285909 | orchestrator | changed: [testbed-manager] => (item=osism-update-docker)
2026-04-07 00:17:20.285924 | orchestrator | changed: [testbed-manager] => (item=osism-update-manager)
2026-04-07 00:17:20.285934 | orchestrator |
2026-04-07 00:17:20.285946 | orchestrator | TASK [osism.services.manager : Copy cilium wrapper script] *********************
2026-04-07 00:17:21.002235 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:21.002385 | orchestrator |
2026-04-07 00:17:21.002412 | orchestrator | TASK [osism.services.manager : Copy hubble wrapper script] *********************
2026-04-07 00:17:21.710393 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:21.710490 | orchestrator |
2026-04-07 00:17:21.710506 | orchestrator | TASK [osism.services.manager : Copy flux wrapper script] ***********************
2026-04-07 00:17:22.335610 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:22.335702 | orchestrator |
2026-04-07 00:17:22.335716 | orchestrator | TASK [osism.services.manager : Include scripts config tasks] *******************
2026-04-07 00:17:22.407690 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-scripts.yml for testbed-manager
2026-04-07 00:17:22.407771 | orchestrator |
2026-04-07 00:17:22.407786 | orchestrator | TASK [osism.services.manager : Include scripts vars file] **********************
2026-04-07 00:17:22.448569 | orchestrator | ok: [testbed-manager]
2026-04-07 00:17:22.448648 | orchestrator |
2026-04-07 00:17:22.448662 | orchestrator | TASK [osism.services.manager : Copy scripts] ***********************************
2026-04-07 00:17:23.093708 | orchestrator | changed: [testbed-manager] => (item=osism-include)
2026-04-07 00:17:23.093804 | orchestrator |
2026-04-07 00:17:23.093820 | orchestrator | TASK [osism.services.manager : Include service tasks] **************************
2026-04-07 00:17:23.177841 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/service.yml for testbed-manager
2026-04-07 00:17:23.177944 | orchestrator |
2026-04-07 00:17:23.177961 | orchestrator | TASK [osism.services.manager : Copy manager systemd unit file] *****************
2026-04-07 00:17:23.819372 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:23.819500 | orchestrator |
2026-04-07 00:17:23.819517 | orchestrator | TASK [osism.services.manager : Create traefik external network] ****************
2026-04-07 00:17:24.377648 | orchestrator | ok: [testbed-manager]
2026-04-07 00:17:24.377732 | orchestrator |
2026-04-07 00:17:24.377746 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb < 11.0.0] ***
2026-04-07 00:17:24.426230 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:17:24.426347 | orchestrator |
2026-04-07 00:17:24.426373 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb >= 11.0.0] ***
2026-04-07 00:17:24.477753 | orchestrator | ok: [testbed-manager]
2026-04-07 00:17:24.477840 | orchestrator |
2026-04-07 00:17:24.477855 | orchestrator | TASK [osism.services.manager : Copy docker-compose.yml file] *******************
2026-04-07 00:17:25.234770 | orchestrator | changed: [testbed-manager]
2026-04-07 00:17:25.234868 | orchestrator |
2026-04-07 00:17:25.234884 | orchestrator | TASK [osism.services.manager : Pull container images] **************************
2026-04-07 00:18:34.519591 | orchestrator | changed: [testbed-manager]
2026-04-07 00:18:34.519726 | orchestrator |
2026-04-07 00:18:34.520593 | orchestrator | TASK [osism.services.manager : Stop and disable old service docker-compose@manager] ***
2026-04-07 00:18:35.476017 | orchestrator | ok: [testbed-manager]
2026-04-07 00:18:35.476113 | orchestrator |
2026-04-07 00:18:35.476127 | orchestrator | TASK [osism.services.manager : Do a manual start of the manager service] *******
2026-04-07 00:18:35.532734 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:18:35.532812 | orchestrator |
2026-04-07 00:18:35.532825 | orchestrator | TASK [osism.services.manager : Manage manager service] *************************
2026-04-07 00:18:46.055483 | orchestrator | changed: [testbed-manager]
2026-04-07 00:18:46.055568 | orchestrator |
2026-04-07 00:18:46.055579 | orchestrator | TASK [osism.services.manager : Register that manager service was started] ******
2026-04-07 00:18:46.108323 | orchestrator | ok: [testbed-manager]
2026-04-07 00:18:46.108405 | orchestrator |
2026-04-07 00:18:46.108418 | orchestrator | TASK [osism.services.manager : Flush handlers] *********************************
2026-04-07 00:18:46.108430 | orchestrator |
2026-04-07 00:18:46.108440 | orchestrator | RUNNING HANDLER [osism.services.manager : Restart manager service] *************
2026-04-07 00:18:46.241022 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:18:46.241106 | orchestrator |
2026-04-07 00:18:46.241122 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for manager service to start] ***
2026-04-07 00:19:46.292194 | orchestrator | Pausing for 60 seconds
2026-04-07 00:19:46.292311 | orchestrator | changed: [testbed-manager]
2026-04-07 00:19:46.292328 | orchestrator |
2026-04-07 00:19:46.292341 | orchestrator | RUNNING HANDLER [osism.services.manager : Ensure that all containers are up] ***
2026-04-07 00:19:49.293127 | orchestrator | changed: [testbed-manager]
2026-04-07 00:19:49.293237 | orchestrator |
2026-04-07 00:19:49.293253 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for an healthy manager service] ***
2026-04-07 00:20:30.694245 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (50 retries left).
2026-04-07 00:20:30.694353 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (49 retries left).
2026-04-07 00:20:30.694369 | orchestrator | changed: [testbed-manager]
2026-04-07 00:20:30.694382 | orchestrator |
2026-04-07 00:20:30.694394 | orchestrator | RUNNING HANDLER [osism.services.manager : Copy osismclient bash completion script] ***
2026-04-07 00:20:35.893367 | orchestrator | changed: [testbed-manager]
2026-04-07 00:20:35.893458 | orchestrator |
2026-04-07 00:20:35.893471 | orchestrator | TASK [osism.services.manager : Include initialize tasks] ***********************
2026-04-07 00:20:35.967969 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/initialize.yml for testbed-manager
2026-04-07 00:20:35.968064 | orchestrator |
2026-04-07 00:20:35.968078 | orchestrator | TASK [osism.services.manager : Flush handlers] *********************************
2026-04-07 00:20:35.968091 | orchestrator |
2026-04-07 00:20:35.968103 | orchestrator | TASK [osism.services.manager : Include vault initialize tasks] *****************
2026-04-07 00:20:36.009385 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:20:36.009475 | orchestrator |
2026-04-07 00:20:36.009490 | orchestrator | TASK [osism.services.manager : Include version verification tasks] *************
2026-04-07 00:20:36.081281 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/verify-versions.yml for testbed-manager
2026-04-07 00:20:36.081392 | orchestrator |
2026-04-07 00:20:36.081420 | orchestrator | TASK [osism.services.manager : Deploy service manager version check script] ****
2026-04-07 00:20:36.741908 | orchestrator | changed: [testbed-manager]
2026-04-07 00:20:36.742011 | orchestrator |
2026-04-07 00:20:36.742085 | orchestrator | TASK [osism.services.manager : Execute service manager version check] **********
2026-04-07 00:20:39.713555 | orchestrator | ok: [testbed-manager]
2026-04-07 00:20:39.713720 | orchestrator |
2026-04-07 00:20:39.713740 | orchestrator | TASK [osism.services.manager : Display version check results] ******************
2026-04-07 00:20:39.781456 | orchestrator | ok: [testbed-manager] => {
2026-04-07 00:20:39.781547 | orchestrator | "version_check_result.stdout_lines": [
2026-04-07 00:20:39.781561 | orchestrator | "=== OSISM Container Version Check ===",
2026-04-07 00:20:39.781572 | orchestrator | "Checking running containers against expected versions...",
2026-04-07 00:20:39.781583 | orchestrator | "",
2026-04-07 00:20:39.781593 | orchestrator | "Checking service: inventory_reconciler (Inventory Reconciler Service)",
2026-04-07 00:20:39.781604 | orchestrator | " Expected: registry.osism.tech/osism/inventory-reconciler:0.20260322.0",
2026-04-07 00:20:39.781614 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.781668 | orchestrator | " Running: registry.osism.tech/osism/inventory-reconciler:0.20260322.0",
2026-04-07 00:20:39.781680 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.781690 | orchestrator | "",
2026-04-07 00:20:39.781700 | orchestrator | "Checking service: osism-ansible (OSISM Ansible Service)",
2026-04-07 00:20:39.781744 | orchestrator | " Expected: registry.osism.tech/osism/osism-ansible:0.20260322.0",
2026-04-07 00:20:39.781755 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.781765 | orchestrator | " Running: registry.osism.tech/osism/osism-ansible:0.20260322.0",
2026-04-07 00:20:39.781775 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.781784 | orchestrator | "",
2026-04-07 00:20:39.781794 | orchestrator | "Checking service: osism-kubernetes (Osism-Kubernetes Service)",
2026-04-07 00:20:39.781803 | orchestrator | " Expected: registry.osism.tech/osism/osism-kubernetes:0.20260322.0",
2026-04-07 00:20:39.781813 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.781822 | orchestrator | " Running: registry.osism.tech/osism/osism-kubernetes:0.20260322.0",
2026-04-07 00:20:39.781832 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.781841 | orchestrator | "",
2026-04-07 00:20:39.781851 | orchestrator | "Checking service: ceph-ansible (Ceph-Ansible Service)",
2026-04-07 00:20:39.781861 | orchestrator | " Expected: registry.osism.tech/osism/ceph-ansible:0.20260322.0",
2026-04-07 00:20:39.781871 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.781880 | orchestrator | " Running: registry.osism.tech/osism/ceph-ansible:0.20260322.0",
2026-04-07 00:20:39.781890 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.781899 | orchestrator | "",
2026-04-07 00:20:39.781908 | orchestrator | "Checking service: kolla-ansible (Kolla-Ansible Service)",
2026-04-07 00:20:39.781918 | orchestrator | " Expected: registry.osism.tech/osism/kolla-ansible:0.20260328.0",
2026-04-07 00:20:39.781927 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.781937 | orchestrator | " Running: registry.osism.tech/osism/kolla-ansible:0.20260328.0",
2026-04-07 00:20:39.781946 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.781956 | orchestrator | "",
2026-04-07 00:20:39.781965 | orchestrator | "Checking service: osismclient (OSISM Client)",
2026-04-07 00:20:39.781975 | orchestrator | " Expected: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-07 00:20:39.781984 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.781993 | orchestrator | " Running: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-07 00:20:39.782003 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.782013 | orchestrator | "",
2026-04-07 00:20:39.782078 | orchestrator | "Checking service: ara-server (ARA Server)",
2026-04-07 00:20:39.782089 | orchestrator | " Expected: registry.osism.tech/osism/ara-server:1.7.3",
2026-04-07 00:20:39.782098 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.782108 | orchestrator | " Running: registry.osism.tech/osism/ara-server:1.7.3",
2026-04-07 00:20:39.782119 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.782129 | orchestrator | "",
2026-04-07 00:20:39.782139 | orchestrator | "Checking service: mariadb (MariaDB for ARA)",
2026-04-07 00:20:39.782148 | orchestrator | " Expected: registry.osism.tech/dockerhub/library/mariadb:11.8.4",
2026-04-07 00:20:39.782158 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.782168 | orchestrator | " Running: registry.osism.tech/dockerhub/library/mariadb:11.8.4",
2026-04-07 00:20:39.782177 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.782187 | orchestrator | "",
2026-04-07 00:20:39.782196 | orchestrator | "Checking service: frontend (OSISM Frontend)",
2026-04-07 00:20:39.782206 | orchestrator | " Expected: registry.osism.tech/osism/osism-frontend:0.20260320.0",
2026-04-07 00:20:39.782215 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.782225 | orchestrator | " Running: registry.osism.tech/osism/osism-frontend:0.20260320.0",
2026-04-07 00:20:39.782234 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.782244 | orchestrator | "",
2026-04-07 00:20:39.782254 | orchestrator | "Checking service: redis (Redis Cache)",
2026-04-07 00:20:39.782263 | orchestrator | " Expected: registry.osism.tech/dockerhub/library/redis:7.4.7-alpine",
2026-04-07 00:20:39.782272 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.782282 | orchestrator | " Running: registry.osism.tech/dockerhub/library/redis:7.4.7-alpine",
2026-04-07 00:20:39.782292 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.782301 | orchestrator | "",
2026-04-07 00:20:39.782311 | orchestrator | "Checking service: api (OSISM API Service)",
2026-04-07 00:20:39.782330 | orchestrator | " Expected: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-07 00:20:39.782339 | orchestrator | " Enabled: true",
2026-04-07 00:20:39.782349 | orchestrator | " Running: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-07 00:20:39.782358 | orchestrator | " Status: ✅ MATCH",
2026-04-07 00:20:39.782368 | orchestrator | "",
2026-04-07 00:20:39.782383 | orchestrator | "Checking service: listener (OpenStack Event Listener)",
2026-04-07 00:20:39.782393 |
orchestrator | " Expected: registry.osism.tech/osism/osism:0.20260320.0", 2026-04-07 00:20:39.782403 | orchestrator | " Enabled: true", 2026-04-07 00:20:39.782413 | orchestrator | " Running: registry.osism.tech/osism/osism:0.20260320.0", 2026-04-07 00:20:39.782422 | orchestrator | " Status: ✅ MATCH", 2026-04-07 00:20:39.782432 | orchestrator | "", 2026-04-07 00:20:39.782443 | orchestrator | "Checking service: openstack (OpenStack Integration)", 2026-04-07 00:20:39.782453 | orchestrator | " Expected: registry.osism.tech/osism/osism:0.20260320.0", 2026-04-07 00:20:39.782462 | orchestrator | " Enabled: true", 2026-04-07 00:20:39.782472 | orchestrator | " Running: registry.osism.tech/osism/osism:0.20260320.0", 2026-04-07 00:20:39.782482 | orchestrator | " Status: ✅ MATCH", 2026-04-07 00:20:39.782492 | orchestrator | "", 2026-04-07 00:20:39.782501 | orchestrator | "Checking service: beat (Celery Beat Scheduler)", 2026-04-07 00:20:39.782511 | orchestrator | " Expected: registry.osism.tech/osism/osism:0.20260320.0", 2026-04-07 00:20:39.782521 | orchestrator | " Enabled: true", 2026-04-07 00:20:39.782530 | orchestrator | " Running: registry.osism.tech/osism/osism:0.20260320.0", 2026-04-07 00:20:39.782557 | orchestrator | " Status: ✅ MATCH", 2026-04-07 00:20:39.782567 | orchestrator | "", 2026-04-07 00:20:39.782577 | orchestrator | "Checking service: flower (Celery Flower Monitor)", 2026-04-07 00:20:39.782587 | orchestrator | " Expected: registry.osism.tech/osism/osism:0.20260320.0", 2026-04-07 00:20:39.782596 | orchestrator | " Enabled: true", 2026-04-07 00:20:39.782606 | orchestrator | " Running: registry.osism.tech/osism/osism:0.20260320.0", 2026-04-07 00:20:39.782616 | orchestrator | " Status: ✅ MATCH", 2026-04-07 00:20:39.782658 | orchestrator | "", 2026-04-07 00:20:39.782669 | orchestrator | "=== Summary ===", 2026-04-07 00:20:39.782679 | orchestrator | "Errors (version mismatches): 0", 2026-04-07 00:20:39.782689 | orchestrator | "Warnings (expected containers not 
running): 0", 2026-04-07 00:20:39.782699 | orchestrator | "", 2026-04-07 00:20:39.782708 | orchestrator | "✅ All running containers match expected versions!" 2026-04-07 00:20:39.782718 | orchestrator | ] 2026-04-07 00:20:39.782728 | orchestrator | } 2026-04-07 00:20:39.782738 | orchestrator | 2026-04-07 00:20:39.782748 | orchestrator | TASK [osism.services.manager : Skip version check due to service configuration] *** 2026-04-07 00:20:39.827287 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:20:39.827384 | orchestrator | 2026-04-07 00:20:39.827399 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:20:39.827412 | orchestrator | testbed-manager : ok=70 changed=37 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 2026-04-07 00:20:39.827424 | orchestrator | 2026-04-07 00:20:39.894871 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2026-04-07 00:20:39.894972 | orchestrator | + deactivate 2026-04-07 00:20:39.894988 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2026-04-07 00:20:39.895002 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-04-07 00:20:39.895013 | orchestrator | + export PATH 2026-04-07 00:20:39.895024 | orchestrator | + unset _OLD_VIRTUAL_PATH 2026-04-07 00:20:39.895036 | orchestrator | + '[' -n '' ']' 2026-04-07 00:20:39.895047 | orchestrator | + hash -r 2026-04-07 00:20:39.895057 | orchestrator | + '[' -n '' ']' 2026-04-07 00:20:39.895068 | orchestrator | + unset VIRTUAL_ENV 2026-04-07 00:20:39.895079 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2026-04-07 00:20:39.895090 | orchestrator | + '[' '!' 
'' = nondestructive ']' 2026-04-07 00:20:39.895101 | orchestrator | + unset -f deactivate 2026-04-07 00:20:39.895112 | orchestrator | + cp /home/dragon/.ssh/id_rsa.pub /opt/ansible/secrets/id_rsa.operator.pub 2026-04-07 00:20:39.900097 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2026-04-07 00:20:39.900143 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2026-04-07 00:20:39.900184 | orchestrator | + local max_attempts=60 2026-04-07 00:20:39.900198 | orchestrator | + local name=ceph-ansible 2026-04-07 00:20:39.900210 | orchestrator | + local attempt_num=1 2026-04-07 00:20:39.900912 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:20:39.934512 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:20:39.934620 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2026-04-07 00:20:39.934735 | orchestrator | + local max_attempts=60 2026-04-07 00:20:39.934759 | orchestrator | + local name=kolla-ansible 2026-04-07 00:20:39.934779 | orchestrator | + local attempt_num=1 2026-04-07 00:20:39.934906 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2026-04-07 00:20:39.965994 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:20:39.966138 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2026-04-07 00:20:39.966152 | orchestrator | + local max_attempts=60 2026-04-07 00:20:39.966164 | orchestrator | + local name=osism-ansible 2026-04-07 00:20:39.966175 | orchestrator | + local attempt_num=1 2026-04-07 00:20:39.966793 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2026-04-07 00:20:40.002777 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:20:40.002867 | orchestrator | + [[ true == \t\r\u\e ]] 2026-04-07 00:20:40.002882 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2026-04-07 00:20:40.627498 | orchestrator | + docker compose 
--project-directory /opt/manager ps 2026-04-07 00:20:40.792014 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS 2026-04-07 00:20:40.792134 | orchestrator | ceph-ansible registry.osism.tech/osism/ceph-ansible:0.20260322.0 "/entrypoint.sh osis…" ceph-ansible About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.792159 | orchestrator | kolla-ansible registry.osism.tech/osism/kolla-ansible:0.20260328.0 "/entrypoint.sh osis…" kolla-ansible About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.792179 | orchestrator | manager-api-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" api About a minute ago Up About a minute (healthy) 192.168.16.5:8000->8000/tcp 2026-04-07 00:20:40.792217 | orchestrator | manager-ara-server-1 registry.osism.tech/osism/ara-server:1.7.3 "sh -c '/wait && /ru…" ara-server About a minute ago Up About a minute (healthy) 8000/tcp 2026-04-07 00:20:40.792239 | orchestrator | manager-beat-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" beat About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.792260 | orchestrator | manager-flower-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" flower About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.792279 | orchestrator | manager-inventory_reconciler-1 registry.osism.tech/osism/inventory-reconciler:0.20260322.0 "/sbin/tini -- /entr…" inventory_reconciler About a minute ago Up 51 seconds (healthy) 2026-04-07 00:20:40.792299 | orchestrator | manager-listener-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" listener About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.792318 | orchestrator | manager-mariadb-1 registry.osism.tech/dockerhub/library/mariadb:11.8.4 "docker-entrypoint.s…" mariadb About a minute ago Up About a minute (healthy) 3306/tcp 2026-04-07 00:20:40.792338 | orchestrator | manager-openstack-1 registry.osism.tech/osism/osism:0.20260320.0 
"/sbin/tini -- osism…" openstack About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.792358 | orchestrator | manager-redis-1 registry.osism.tech/dockerhub/library/redis:7.4.7-alpine "docker-entrypoint.s…" redis About a minute ago Up About a minute (healthy) 6379/tcp 2026-04-07 00:20:40.792409 | orchestrator | osism-ansible registry.osism.tech/osism/osism-ansible:0.20260322.0 "/entrypoint.sh osis…" osism-ansible About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.792429 | orchestrator | osism-frontend registry.osism.tech/osism/osism-frontend:0.20260320.0 "docker-entrypoint.s…" frontend About a minute ago Up About a minute 192.168.16.5:3000->3000/tcp 2026-04-07 00:20:40.792440 | orchestrator | osism-kubernetes registry.osism.tech/osism/osism-kubernetes:0.20260322.0 "/entrypoint.sh osis…" osism-kubernetes About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.792452 | orchestrator | osismclient registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- sleep…" osismclient About a minute ago Up About a minute (healthy) 2026-04-07 00:20:40.795476 | orchestrator | ++ semver 10.0.0 7.0.0 2026-04-07 00:20:40.824297 | orchestrator | + [[ 1 -ge 0 ]] 2026-04-07 00:20:40.824382 | orchestrator | + sed -i s/community.general.yaml/osism.commons.still_alive/ /opt/configuration/environments/ansible.cfg 2026-04-07 00:20:40.826359 | orchestrator | + osism apply resolvconf -l testbed-manager 2026-04-07 00:20:53.240358 | orchestrator | 2026-04-07 00:20:53 | INFO  | Prepare task for execution of resolvconf. 2026-04-07 00:20:53.452715 | orchestrator | 2026-04-07 00:20:53 | INFO  | Task 1d776026-f721-489b-9526-58ac5fd77362 (resolvconf) was prepared for execution. 2026-04-07 00:20:53.452818 | orchestrator | 2026-04-07 00:20:53 | INFO  | It takes a moment until task 1d776026-f721-489b-9526-58ac5fd77362 (resolvconf) has been started and output is visible here. 
2026-04-07 00:21:05.916490 | orchestrator | 2026-04-07 00:21:05.916626 | orchestrator | PLAY [Apply role resolvconf] *************************************************** 2026-04-07 00:21:05.916652 | orchestrator | 2026-04-07 00:21:05.916725 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-04-07 00:21:05.916746 | orchestrator | Tuesday 07 April 2026 00:20:56 +0000 (0:00:00.174) 0:00:00.174 ********* 2026-04-07 00:21:05.916766 | orchestrator | ok: [testbed-manager] 2026-04-07 00:21:05.916786 | orchestrator | 2026-04-07 00:21:05.916805 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2026-04-07 00:21:05.916825 | orchestrator | Tuesday 07 April 2026 00:21:00 +0000 (0:00:03.610) 0:00:03.784 ********* 2026-04-07 00:21:05.916844 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:21:05.916864 | orchestrator | 2026-04-07 00:21:05.916883 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2026-04-07 00:21:05.916902 | orchestrator | Tuesday 07 April 2026 00:21:00 +0000 (0:00:00.055) 0:00:03.840 ********* 2026-04-07 00:21:05.916922 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager 2026-04-07 00:21:05.916942 | orchestrator | 2026-04-07 00:21:05.916961 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2026-04-07 00:21:05.916981 | orchestrator | Tuesday 07 April 2026 00:21:00 +0000 (0:00:00.075) 0:00:03.916 ********* 2026-04-07 00:21:05.917000 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager 2026-04-07 00:21:05.917019 | orchestrator | 2026-04-07 00:21:05.917039 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring 
/etc/resolv.conf] *** 2026-04-07 00:21:05.917058 | orchestrator | Tuesday 07 April 2026 00:21:00 +0000 (0:00:00.058) 0:00:03.974 ********* 2026-04-07 00:21:05.917077 | orchestrator | ok: [testbed-manager] 2026-04-07 00:21:05.917096 | orchestrator | 2026-04-07 00:21:05.917115 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2026-04-07 00:21:05.917134 | orchestrator | Tuesday 07 April 2026 00:21:01 +0000 (0:00:01.066) 0:00:05.041 ********* 2026-04-07 00:21:05.917153 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:21:05.917173 | orchestrator | 2026-04-07 00:21:05.917224 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2026-04-07 00:21:05.917244 | orchestrator | Tuesday 07 April 2026 00:21:01 +0000 (0:00:00.058) 0:00:05.099 ********* 2026-04-07 00:21:05.917262 | orchestrator | ok: [testbed-manager] 2026-04-07 00:21:05.917282 | orchestrator | 2026-04-07 00:21:05.917301 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2026-04-07 00:21:05.917320 | orchestrator | Tuesday 07 April 2026 00:21:01 +0000 (0:00:00.505) 0:00:05.605 ********* 2026-04-07 00:21:05.917339 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:21:05.917358 | orchestrator | 2026-04-07 00:21:05.917377 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2026-04-07 00:21:05.917397 | orchestrator | Tuesday 07 April 2026 00:21:02 +0000 (0:00:00.068) 0:00:05.673 ********* 2026-04-07 00:21:05.917416 | orchestrator | changed: [testbed-manager] 2026-04-07 00:21:05.917435 | orchestrator | 2026-04-07 00:21:05.917454 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2026-04-07 00:21:05.917473 | orchestrator | Tuesday 07 April 2026 00:21:02 +0000 (0:00:00.570) 0:00:06.243 ********* 2026-04-07 00:21:05.917492 | orchestrator | changed: 
[testbed-manager] 2026-04-07 00:21:05.917511 | orchestrator | 2026-04-07 00:21:05.917531 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2026-04-07 00:21:05.917550 | orchestrator | Tuesday 07 April 2026 00:21:03 +0000 (0:00:00.959) 0:00:07.203 ********* 2026-04-07 00:21:05.917570 | orchestrator | ok: [testbed-manager] 2026-04-07 00:21:05.917589 | orchestrator | 2026-04-07 00:21:05.917607 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2026-04-07 00:21:05.917626 | orchestrator | Tuesday 07 April 2026 00:21:04 +0000 (0:00:00.948) 0:00:08.152 ********* 2026-04-07 00:21:05.917645 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager 2026-04-07 00:21:05.917686 | orchestrator | 2026-04-07 00:21:05.917705 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2026-04-07 00:21:05.917725 | orchestrator | Tuesday 07 April 2026 00:21:04 +0000 (0:00:00.071) 0:00:08.224 ********* 2026-04-07 00:21:05.917744 | orchestrator | changed: [testbed-manager] 2026-04-07 00:21:05.917763 | orchestrator | 2026-04-07 00:21:05.917781 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:21:05.917801 | orchestrator | testbed-manager : ok=10  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2026-04-07 00:21:05.917821 | orchestrator | 2026-04-07 00:21:05.917840 | orchestrator | 2026-04-07 00:21:05.917859 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:21:05.917878 | orchestrator | Tuesday 07 April 2026 00:21:05 +0000 (0:00:01.137) 0:00:09.362 ********* 2026-04-07 00:21:05.917897 | orchestrator | =============================================================================== 2026-04-07 00:21:05.917916 | 
orchestrator | Gathering Facts --------------------------------------------------------- 3.61s 2026-04-07 00:21:05.917954 | orchestrator | osism.commons.resolvconf : Restart systemd-resolved service ------------- 1.14s 2026-04-07 00:21:05.917972 | orchestrator | osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf --- 1.07s 2026-04-07 00:21:05.917990 | orchestrator | osism.commons.resolvconf : Copy configuration files --------------------- 0.96s 2026-04-07 00:21:05.918007 | orchestrator | osism.commons.resolvconf : Start/enable systemd-resolved service -------- 0.95s 2026-04-07 00:21:05.918111 | orchestrator | osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf --- 0.57s 2026-04-07 00:21:05.918160 | orchestrator | osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf ----- 0.51s 2026-04-07 00:21:05.918181 | orchestrator | osism.commons.resolvconf : Include resolvconf tasks --------------------- 0.08s 2026-04-07 00:21:05.918200 | orchestrator | osism.commons.resolvconf : Include distribution specific configuration tasks --- 0.07s 2026-04-07 00:21:05.918220 | orchestrator | osism.commons.resolvconf : Archive existing file /etc/resolv.conf ------- 0.07s 2026-04-07 00:21:05.918257 | orchestrator | osism.commons.resolvconf : Include distribution specific installation tasks --- 0.06s 2026-04-07 00:21:05.918278 | orchestrator | osism.commons.resolvconf : Install package systemd-resolved ------------- 0.06s 2026-04-07 00:21:05.918299 | orchestrator | osism.commons.resolvconf : Check minimum and maximum number of name servers --- 0.06s 2026-04-07 00:21:06.077284 | orchestrator | + osism apply sshconfig 2026-04-07 00:21:17.350420 | orchestrator | 2026-04-07 00:21:17 | INFO  | Prepare task for execution of sshconfig. 2026-04-07 00:21:17.428525 | orchestrator | 2026-04-07 00:21:17 | INFO  | Task 4358b80a-1ccb-4767-8c6c-4bfa6d828677 (sshconfig) was prepared for execution. 
2026-04-07 00:21:17.428623 | orchestrator | 2026-04-07 00:21:17 | INFO  | It takes a moment until task 4358b80a-1ccb-4767-8c6c-4bfa6d828677 (sshconfig) has been started and output is visible here. 2026-04-07 00:21:28.384066 | orchestrator | 2026-04-07 00:21:28.384241 | orchestrator | PLAY [Apply role sshconfig] **************************************************** 2026-04-07 00:21:28.384271 | orchestrator | 2026-04-07 00:21:28.384293 | orchestrator | TASK [osism.commons.sshconfig : Get home directory of operator user] *********** 2026-04-07 00:21:28.384314 | orchestrator | Tuesday 07 April 2026 00:21:20 +0000 (0:00:00.184) 0:00:00.184 ********* 2026-04-07 00:21:28.384334 | orchestrator | ok: [testbed-manager] 2026-04-07 00:21:28.384355 | orchestrator | 2026-04-07 00:21:28.384375 | orchestrator | TASK [osism.commons.sshconfig : Ensure .ssh/config.d exist] ******************** 2026-04-07 00:21:28.384394 | orchestrator | Tuesday 07 April 2026 00:21:21 +0000 (0:00:00.925) 0:00:01.110 ********* 2026-04-07 00:21:28.384413 | orchestrator | changed: [testbed-manager] 2026-04-07 00:21:28.384434 | orchestrator | 2026-04-07 00:21:28.384454 | orchestrator | TASK [osism.commons.sshconfig : Ensure config for each host exist] ************* 2026-04-07 00:21:28.384474 | orchestrator | Tuesday 07 April 2026 00:21:22 +0000 (0:00:00.531) 0:00:01.641 ********* 2026-04-07 00:21:28.384494 | orchestrator | changed: [testbed-manager] => (item=testbed-manager) 2026-04-07 00:21:28.384513 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0) 2026-04-07 00:21:28.384533 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1) 2026-04-07 00:21:28.384552 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2) 2026-04-07 00:21:28.384571 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3) 2026-04-07 00:21:28.384591 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4) 2026-04-07 00:21:28.384611 | orchestrator | changed: 
[testbed-manager] => (item=testbed-node-5) 2026-04-07 00:21:28.384631 | orchestrator | 2026-04-07 00:21:28.384650 | orchestrator | TASK [osism.commons.sshconfig : Add extra config] ****************************** 2026-04-07 00:21:28.384670 | orchestrator | Tuesday 07 April 2026 00:21:27 +0000 (0:00:05.550) 0:00:07.192 ********* 2026-04-07 00:21:28.384774 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:21:28.384810 | orchestrator | 2026-04-07 00:21:28.384830 | orchestrator | TASK [osism.commons.sshconfig : Assemble ssh config] *************************** 2026-04-07 00:21:28.384848 | orchestrator | Tuesday 07 April 2026 00:21:27 +0000 (0:00:00.114) 0:00:07.307 ********* 2026-04-07 00:21:28.384867 | orchestrator | changed: [testbed-manager] 2026-04-07 00:21:28.384885 | orchestrator | 2026-04-07 00:21:28.384903 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:21:28.384925 | orchestrator | testbed-manager : ok=4  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-07 00:21:28.384945 | orchestrator | 2026-04-07 00:21:28.384965 | orchestrator | 2026-04-07 00:21:28.384985 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:21:28.385005 | orchestrator | Tuesday 07 April 2026 00:21:28 +0000 (0:00:00.508) 0:00:07.815 ********* 2026-04-07 00:21:28.385025 | orchestrator | =============================================================================== 2026-04-07 00:21:28.385044 | orchestrator | osism.commons.sshconfig : Ensure config for each host exist ------------- 5.55s 2026-04-07 00:21:28.385105 | orchestrator | osism.commons.sshconfig : Get home directory of operator user ----------- 0.93s 2026-04-07 00:21:28.385125 | orchestrator | osism.commons.sshconfig : Ensure .ssh/config.d exist -------------------- 0.53s 2026-04-07 00:21:28.385145 | orchestrator | osism.commons.sshconfig : Assemble ssh config 
--------------------------- 0.51s 2026-04-07 00:21:28.385165 | orchestrator | osism.commons.sshconfig : Add extra config ------------------------------ 0.11s 2026-04-07 00:21:28.540859 | orchestrator | + osism apply known-hosts 2026-04-07 00:21:39.923686 | orchestrator | 2026-04-07 00:21:39 | INFO  | Prepare task for execution of known-hosts. 2026-04-07 00:21:39.998472 | orchestrator | 2026-04-07 00:21:39 | INFO  | Task d76a0b5a-0672-45e6-aa8b-32de45d47e5d (known-hosts) was prepared for execution. 2026-04-07 00:21:39.998585 | orchestrator | 2026-04-07 00:21:39 | INFO  | It takes a moment until task d76a0b5a-0672-45e6-aa8b-32de45d47e5d (known-hosts) has been started and output is visible here. 2026-04-07 00:21:55.279063 | orchestrator | 2026-04-07 00:21:55.279141 | orchestrator | PLAY [Apply role known_hosts] ************************************************** 2026-04-07 00:21:55.279149 | orchestrator | 2026-04-07 00:21:55.279161 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname] *** 2026-04-07 00:21:55.279166 | orchestrator | Tuesday 07 April 2026 00:21:43 +0000 (0:00:00.190) 0:00:00.190 ********* 2026-04-07 00:21:55.279172 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2026-04-07 00:21:55.279177 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2026-04-07 00:21:55.279181 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2026-04-07 00:21:55.279185 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2026-04-07 00:21:55.279189 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2026-04-07 00:21:55.279193 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2026-04-07 00:21:55.279197 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2026-04-07 00:21:55.279201 | orchestrator | 2026-04-07 00:21:55.279205 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname] *** 2026-04-07 
00:21:55.279209 | orchestrator | Tuesday 07 April 2026 00:21:49 +0000 (0:00:06.328) 0:00:06.518 ********* 2026-04-07 00:21:55.279214 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2026-04-07 00:21:55.279220 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2026-04-07 00:21:55.279224 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2026-04-07 00:21:55.279228 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2026-04-07 00:21:55.279232 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2026-04-07 00:21:55.279236 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2026-04-07 00:21:55.279240 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2026-04-07 00:21:55.279243 | orchestrator | 2026-04-07 00:21:55.279247 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:21:55.279251 | orchestrator | Tuesday 07 April 2026 00:21:49 +0000 (0:00:00.155) 0:00:06.673 ********* 2026-04-07 00:21:55.279255 | 
orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII56IeNausWIQcxlCzDSv63cva2Xz4FTfb2PbHECp4rr) 2026-04-07 00:21:55.279278 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQfOzw8U3apWU5vxN3E5jtsr+4ya//0+n9uQ876vF6lZew0S9PhHtZjLbIDgjp9LlNj9t1gIYbfNCqRIvxQVIz/Ips3/bjyUlRh+NFw+GywLTx+6SJANyUICRY8FQqbwQDyjjN4JrKxeYfJe3QTcRcs/Y/5mR2u3f1DiYZjSHpZ1TglBlC+nP8GoM2pBtLcncpKXzqUYTDtEo4VccFF/sz3QlV1Cs3jhLfVs4bxKgYV2vb88JZz4jrWXWNSxqV8D/7Mil4EnfR9b9BMr6/HUctSvKxDzhiRe+FeHToaJhvIUzvkTsrl0TD63yAjNidJO/G67czbYWZ52jhrOhrG+0gJ4fRbDufrM70GiCIdyTsVafEtFj20sclwm0SSW/baq4hWk5O2cRrNWspgcw9qr+nikuC+f6ZEUxMNCbn7nC2sRqQHnqnzqSuLsoquSAcZ6sTTRBbCuIYe7UgwcKpmB0ZKHvSpNsv8RNroRy86k84woHq6jyR4h5tGnFED+OWXlE=) 2026-04-07 00:21:55.279285 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD5rat9Tqj1Y7OX0enjiByTWI2aKLPYH/eM5UIlPiboSPe85deGizeUIleUCv8hQoS5JvDpjDYaEU7k8qH3pRoA=) 2026-04-07 00:21:55.279290 | orchestrator | 2026-04-07 00:21:55.279294 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:21:55.279297 | orchestrator | Tuesday 07 April 2026 00:21:50 +0000 (0:00:01.229) 0:00:07.903 ********* 2026-04-07 00:21:55.279312 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD36yRpbuRBNy8Rm7hPXwuQadhhSbu1uREp9dntqeO24AmVop1Joq3ZimIKZoD8a+Cd0tntiHsD9vVC10tMUFtbb6AQLzCw803vYWtmncWtqjoA0V8gH5Zdnzy3xMH7UZO7f9kSbnRoaDVu1TtGXXXdRPBYEp3zbyECBBgQDfWKIVFmTezgyC42cJMu5CgnxohX7ZVQTcPSEDhc95fy/Cq88zAGxMoYNvil4abiHtMt2Y6laIy74t8Wi8yKdLeVnH7rZTd1tohxGbBNcNouUKPqctIpym9O2ouOp+1S4bmt2WDHgnzrCjLK/SVajQdkiPwp3USBe5m+u8ZDShTEUTVEtSu1pHaQ65TCG2s4Ea7GHaMNWMtwx8zeOWO1CjBc4kY/z2kTd+yfcLCJGk01grjICJKFQd6dnCe2Ad7LJxW6UIVqo1YFD5g02bFkBnCCf0AG3QTXE1us0IDmWwgEw0iFeR9JM6EKwaOsm3yXNIPHZIzcTNzz0+biorXIrBeh0/8=) 
2026-04-07 00:21:55.279317 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFm7QRP9PzG6T/EHnGIqVL/RcO7htBfZteBaOTXuL9Jz) 2026-04-07 00:21:55.279321 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM7dZhBTZgJZBL6JfYt5BIoTjLuO2XGiBrr6hs2+2rJ+UYURKU2YqhhgSrV0ja6jRrn/MVNssWVbkT2FtVqN/M4=) 2026-04-07 00:21:55.279325 | orchestrator | 2026-04-07 00:21:55.279329 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:21:55.279332 | orchestrator | Tuesday 07 April 2026 00:21:51 +0000 (0:00:01.074) 0:00:08.977 ********* 2026-04-07 00:21:55.279336 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN3xoouKKG0BNp6YZpAmnX8/Yj0XIOJek9+BBaaxlWRk) 2026-04-07 00:21:55.279382 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4m1WIvAqsrPd5/mBZM7nUdhXst9pYZ9f/6rFN21rJiL96r2y/5nv9HHKnXtWu6IyDjvN7g2ZuD2sD4BYibiDicICY44+7Oq5/M28d/akbYozhKrnttwAkVDxQT1fgn63c6rb/9cnvUwS6Is7cAjJQnSF3R5uiA4rKZGA0RSMXZezxtfbsM8hlBcUdAoSl5VPBN5/LZ7b25Adf43T2RRIU6EKvDcCb7xxxNAueXWcbKEE/wwjf0HsQ1R8Qc1nRyyHMWb4jrAtTbzhYq8lhd/mxEFzFxRvJhubTEP8FH/GZleR0020QNBmASwAbrXRtFuaUB8Czn8NRm4jDDXqDiRjfJele/a/xBZzFJMJkeqCDQUMgmzpBEe7OTDgRf4kcUiEAEMRqofvfG1sIMgomWWtauq6EVjbJAPhG/UNZ82cVlzPtYXiZT6CiCLwYYhoVOM9UErOpuTM/FxWFQwHvp1CHJB7VMn6ThsRxxnPVseFTzuowDg/TQj26K3Ym+IAWimU=) 2026-04-07 00:21:55.279387 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDN5PWk3AHd8+pr6XnN3xFZ1XWBc8NjlSvd5YbCKJm8KuHfMFYJI3ZRGt25kinbmelWbEUI6thVX4Nf5GmgWzII=) 2026-04-07 00:21:55.279390 | orchestrator | 2026-04-07 00:21:55.279394 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:21:55.279398 
| orchestrator | Tuesday 07 April 2026 00:21:52 +0000 (0:00:01.001) 0:00:09.978 ********* 2026-04-07 00:21:55.279404 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC51KLSeht58l5LhUwxngCQpd51u8xHwvpIjJVd/bK0F5FRv6n0iryicPoAj3OaEr7i1zgXnO+izIBUN5wOBWdqO30gYVgFOh/mCrg5WvVt2nTLeM5Hx2wIhI3r1oxxMLeUxK2UBAFXWRNN1ZPfLOHS2i6fchZtyp72Az4Lp41YMmfzhJcVzoOHjQqZmiU5AGeWICGkO4PGkPYvi5NTbhFJDkOrXJ5HS8dt7IMdORlTEC3xhltkJ78ubu9lN6OZozyNMjSmbn3wcXtV4SyFdCcBlDQL5LVL0cPNj5Rw/YCQRsjy0eGIEnwBoBubouh6GTEoqW1ET08qcy1fqmiRbRH7NuYGiHoI59I1/zUgdYSI/KxUxaTtvs/0jM0GBPzLm9s1YxsNTh/IWLh/mMKz3Cn4OFw65Q3Mgyn8KhM0AQTEsrmmIZFTd5ppdzfpMrJOuuzqGcoy3jtRfiHYRrB/KFgt7TAZk8lC2Oev2h31aG9IvulzW0YWSz3e1JzrnaBf6mU=) 2026-04-07 00:21:55.279412 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHhm+O38SBA0V2lPk2i98wa7qzXeSUocWb21EXnISmaGxd/GoQtCnhaZgoM55JMt8j5YoASwxd7NsMOwdbE5s2E=) 2026-04-07 00:21:55.279416 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGfMogF7QmaUfRXTwgcnREFPBtawcIg3IfQJeRVTU1jP) 2026-04-07 00:21:55.279420 | orchestrator | 2026-04-07 00:21:55.279424 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:21:55.279428 | orchestrator | Tuesday 07 April 2026 00:21:53 +0000 (0:00:01.014) 0:00:10.992 ********* 2026-04-07 00:21:55.279432 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCdqNr7JW/Uz7Fj7uuN11hf0QxHRLuuu6F4zsLraBt83e/oyvI0dYmi3HrxuG94hyi9Ji0LCPbjj+NRXSBV1OTz1nI+j49htc/XTvmRtNo3d7nP5KHbHSFQlwCTg8wbY1dAZ4SbPmu/lOJjyXZlj3f3eK15Hem1Oxqdy8R9DCFtpP7J7+aLC4PoTxkIWYP9bdIudnlC8yMuuWJWDyOg58wIK8NUpJi24QhKjm8eNRAqMxCBiWR7ST3m9wtVILEEni55adVlXUn1X3k/V9CUCbAJL/yP5ooso/WAsqX9PLZwnwRZJYjPs/0hGPrTHyq9oa3auqDv1pPHSoOTiqId4qSs4GsbjFUkfRLAIXQzCPBojk2R2epsdCXV1+HocvF0rlv2DoqxAGT/KoIBCpWhEs4AH1WS2cU/FWaCf0IFBp2TdbpjA/fXXQNwHyKoAtReSgSZpekclgmyUgbZvuYvKRfSb1gOkPvuF1vmJVQR/34gvmw4Nnty1d4xmePexoC5yM0=) 2026-04-07 00:21:55.279435 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMgrctqlXpKHMIB9ocAxSr0dwLnsvUDM5Z3mbrCW7Yb/wyjSUmsbSJF5HLtGCNPq1nsizRRlTL78+8EiHB640Cw=) 2026-04-07 00:21:55.279439 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKiOy1YhlNugvcsISTS6rVBLZrMKPWJ2LxWFrQtpkWQX) 2026-04-07 00:21:55.279443 | orchestrator | 2026-04-07 00:21:55.279447 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:21:55.279451 | orchestrator | Tuesday 07 April 2026 00:21:54 +0000 (0:00:01.007) 0:00:12.000 ********* 2026-04-07 00:21:55.279458 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrtRuHP22VPuPz5OyVNpG6DkDeOtd1+fs40gLrM/HB63m+8dWmGnj2AOcPlqCRlAGOsbVwNCa7R6RHUEr8Jn6qshFAN/9OJu17RyolePejXb0jKC7gywVu4ATrpo2g0MhVKAKHXfv0ghJZPpp1uvV+1zFbQK5hCzF3pyniPDcP6hYNfFnlIrycIsIlfloC2gvvPJuMpVBiMh4HeDyHVtgKWReEslik3jRAl1/A6iCyCYgaTMRyhkgHPL+l2l/Pf7cfjq4jneHiaOAWTi82LBijw0uykmuGRQSBP2ZjAykLmTVcuewdRK/KvoMGQrEET4sfxYuVOSXV2qGSrC7aZTUfLyDDNmqzKVwDJylDxzW48F4Jp0VKahHlEwDeNY+I1LztV4QdJS2AkO6sn6uYClBohjz840q9dfwmnRBmyTA5wFs+YBpAz7ehFFZI66vLi3b1AK6f2x28f6TCP97vVVSSshwIg3n8p8eholaOluHXszhVnvvNzLZB42myOm3+8B0=) 2026-04-07 00:22:06.387325 | orchestrator | changed: [testbed-manager] => 
(item=testbed-node-4 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGQ8zmi+FI9+5PQvIGdrEj88/DQdefQCxL7IIb3EjSAvzsTaWYBoHGwGcL9IuK4U67Y74dGLxhdgnj3TtUarnlg=) 2026-04-07 00:22:06.387428 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFT+BKr11xRxnddKO9voUABE3rwoNy7qiM9x0Z6B+Dc5) 2026-04-07 00:22:06.387444 | orchestrator | 2026-04-07 00:22:06.387455 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:22:06.387466 | orchestrator | Tuesday 07 April 2026 00:21:55 +0000 (0:00:01.045) 0:00:13.045 ********* 2026-04-07 00:22:06.387477 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICiGRUa9o/mfArWxBQbH5wURLfVaMZKBgUizCHDtNg7w) 2026-04-07 00:22:06.387489 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDV4pWOKwvKIfkXmPufeNdR66vaGkeXi7SQ25CD+u51aXVjMxw1bLtfkjgIeEize0RABJz28FLuBVLC6yE83pvm4MBYaVPMCp67wL9oc1ilBaPdV93SDv1JZ9qC4TVlUXFEwwetW3JFZ0ovtQsL2WOvOHbj40Z2RyB79L9M/bGJOmlkp/DywaDC2MdEIL1ZZBi16GNoJmnz9sZc3N5sizqEseN2v4GVmjX1ZbED1hN5boc/gZsiSkL+LmHMqlOaQPvOKZuE3tACNmNuPqGhqAuHTJfaonrGgc5CAf5Q2Y7Br1xaz8qjIURYsz8qWI49ftSEVh8zWjlsUaUD9cBsn6ERQAIOYCwaPypOaLnkaMKL6BDNifDLDnej2DCHiJ9fZbwUxujfcRpSSQ3EEYIAH8oXQRLxD/qb+Uy9BbIQsdtNaH9CZJ5mpb5eNTYFGoT8cDORvLp7j7Ahm4mOeVWwNHI7Z6nqVgKXonptPrR/lMwTvQruYRPvlOlheBkVLcWGu9k=) 2026-04-07 00:22:06.387520 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBInTeTqLGXpjyfGjuO4vhktDbh+MsUPh+A/HQRiVFD5FEXDDzG7PrhOhYljrgqYq4Wfu8E3+/ToeRDUy0EFLdgk=) 2026-04-07 00:22:06.387530 | orchestrator | 2026-04-07 00:22:06.387541 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host] *** 2026-04-07 00:22:06.387551 | orchestrator | Tuesday 07 April 2026 00:21:56 +0000 (0:00:01.033) 
0:00:14.079 ********* 2026-04-07 00:22:06.387562 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2026-04-07 00:22:06.387571 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2026-04-07 00:22:06.387581 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2026-04-07 00:22:06.387591 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2026-04-07 00:22:06.387600 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2026-04-07 00:22:06.387610 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2026-04-07 00:22:06.387619 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2026-04-07 00:22:06.387629 | orchestrator | 2026-04-07 00:22:06.387638 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host] *** 2026-04-07 00:22:06.387657 | orchestrator | Tuesday 07 April 2026 00:22:02 +0000 (0:00:05.225) 0:00:19.304 ********* 2026-04-07 00:22:06.387668 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2026-04-07 00:22:06.387680 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2026-04-07 00:22:06.387690 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2026-04-07 00:22:06.387700 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2026-04-07 00:22:06.387709 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2026-04-07 00:22:06.387719 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2026-04-07 00:22:06.387729 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2026-04-07 00:22:06.387764 | orchestrator | 2026-04-07 00:22:06.387774 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:22:06.387784 | orchestrator | Tuesday 07 April 2026 00:22:02 +0000 (0:00:00.166) 0:00:19.471 ********* 2026-04-07 00:22:06.387794 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD5rat9Tqj1Y7OX0enjiByTWI2aKLPYH/eM5UIlPiboSPe85deGizeUIleUCv8hQoS5JvDpjDYaEU7k8qH3pRoA=) 2026-04-07 00:22:06.387823 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQfOzw8U3apWU5vxN3E5jtsr+4ya//0+n9uQ876vF6lZew0S9PhHtZjLbIDgjp9LlNj9t1gIYbfNCqRIvxQVIz/Ips3/bjyUlRh+NFw+GywLTx+6SJANyUICRY8FQqbwQDyjjN4JrKxeYfJe3QTcRcs/Y/5mR2u3f1DiYZjSHpZ1TglBlC+nP8GoM2pBtLcncpKXzqUYTDtEo4VccFF/sz3QlV1Cs3jhLfVs4bxKgYV2vb88JZz4jrWXWNSxqV8D/7Mil4EnfR9b9BMr6/HUctSvKxDzhiRe+FeHToaJhvIUzvkTsrl0TD63yAjNidJO/G67czbYWZ52jhrOhrG+0gJ4fRbDufrM70GiCIdyTsVafEtFj20sclwm0SSW/baq4hWk5O2cRrNWspgcw9qr+nikuC+f6ZEUxMNCbn7nC2sRqQHnqnzqSuLsoquSAcZ6sTTRBbCuIYe7UgwcKpmB0ZKHvSpNsv8RNroRy86k84woHq6jyR4h5tGnFED+OWXlE=) 2026-04-07 00:22:06.387841 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII56IeNausWIQcxlCzDSv63cva2Xz4FTfb2PbHECp4rr) 2026-04-07 
00:22:06.387851 | orchestrator | 2026-04-07 00:22:06.387860 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:22:06.387870 | orchestrator | Tuesday 07 April 2026 00:22:03 +0000 (0:00:01.011) 0:00:20.483 ********* 2026-04-07 00:22:06.387880 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFm7QRP9PzG6T/EHnGIqVL/RcO7htBfZteBaOTXuL9Jz) 2026-04-07 00:22:06.387891 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD36yRpbuRBNy8Rm7hPXwuQadhhSbu1uREp9dntqeO24AmVop1Joq3ZimIKZoD8a+Cd0tntiHsD9vVC10tMUFtbb6AQLzCw803vYWtmncWtqjoA0V8gH5Zdnzy3xMH7UZO7f9kSbnRoaDVu1TtGXXXdRPBYEp3zbyECBBgQDfWKIVFmTezgyC42cJMu5CgnxohX7ZVQTcPSEDhc95fy/Cq88zAGxMoYNvil4abiHtMt2Y6laIy74t8Wi8yKdLeVnH7rZTd1tohxGbBNcNouUKPqctIpym9O2ouOp+1S4bmt2WDHgnzrCjLK/SVajQdkiPwp3USBe5m+u8ZDShTEUTVEtSu1pHaQ65TCG2s4Ea7GHaMNWMtwx8zeOWO1CjBc4kY/z2kTd+yfcLCJGk01grjICJKFQd6dnCe2Ad7LJxW6UIVqo1YFD5g02bFkBnCCf0AG3QTXE1us0IDmWwgEw0iFeR9JM6EKwaOsm3yXNIPHZIzcTNzz0+biorXIrBeh0/8=) 2026-04-07 00:22:06.387901 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM7dZhBTZgJZBL6JfYt5BIoTjLuO2XGiBrr6hs2+2rJ+UYURKU2YqhhgSrV0ja6jRrn/MVNssWVbkT2FtVqN/M4=) 2026-04-07 00:22:06.387911 | orchestrator | 2026-04-07 00:22:06.387921 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:22:06.387931 | orchestrator | Tuesday 07 April 2026 00:22:04 +0000 (0:00:01.037) 0:00:21.521 ********* 2026-04-07 00:22:06.387941 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC4m1WIvAqsrPd5/mBZM7nUdhXst9pYZ9f/6rFN21rJiL96r2y/5nv9HHKnXtWu6IyDjvN7g2ZuD2sD4BYibiDicICY44+7Oq5/M28d/akbYozhKrnttwAkVDxQT1fgn63c6rb/9cnvUwS6Is7cAjJQnSF3R5uiA4rKZGA0RSMXZezxtfbsM8hlBcUdAoSl5VPBN5/LZ7b25Adf43T2RRIU6EKvDcCb7xxxNAueXWcbKEE/wwjf0HsQ1R8Qc1nRyyHMWb4jrAtTbzhYq8lhd/mxEFzFxRvJhubTEP8FH/GZleR0020QNBmASwAbrXRtFuaUB8Czn8NRm4jDDXqDiRjfJele/a/xBZzFJMJkeqCDQUMgmzpBEe7OTDgRf4kcUiEAEMRqofvfG1sIMgomWWtauq6EVjbJAPhG/UNZ82cVlzPtYXiZT6CiCLwYYhoVOM9UErOpuTM/FxWFQwHvp1CHJB7VMn6ThsRxxnPVseFTzuowDg/TQj26K3Ym+IAWimU=) 2026-04-07 00:22:06.387951 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDN5PWk3AHd8+pr6XnN3xFZ1XWBc8NjlSvd5YbCKJm8KuHfMFYJI3ZRGt25kinbmelWbEUI6thVX4Nf5GmgWzII=) 2026-04-07 00:22:06.387961 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN3xoouKKG0BNp6YZpAmnX8/Yj0XIOJek9+BBaaxlWRk) 2026-04-07 00:22:06.387971 | orchestrator | 2026-04-07 00:22:06.387981 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:22:06.387990 | orchestrator | Tuesday 07 April 2026 00:22:05 +0000 (0:00:00.990) 0:00:22.511 ********* 2026-04-07 00:22:06.388000 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC51KLSeht58l5LhUwxngCQpd51u8xHwvpIjJVd/bK0F5FRv6n0iryicPoAj3OaEr7i1zgXnO+izIBUN5wOBWdqO30gYVgFOh/mCrg5WvVt2nTLeM5Hx2wIhI3r1oxxMLeUxK2UBAFXWRNN1ZPfLOHS2i6fchZtyp72Az4Lp41YMmfzhJcVzoOHjQqZmiU5AGeWICGkO4PGkPYvi5NTbhFJDkOrXJ5HS8dt7IMdORlTEC3xhltkJ78ubu9lN6OZozyNMjSmbn3wcXtV4SyFdCcBlDQL5LVL0cPNj5Rw/YCQRsjy0eGIEnwBoBubouh6GTEoqW1ET08qcy1fqmiRbRH7NuYGiHoI59I1/zUgdYSI/KxUxaTtvs/0jM0GBPzLm9s1YxsNTh/IWLh/mMKz3Cn4OFw65Q3Mgyn8KhM0AQTEsrmmIZFTd5ppdzfpMrJOuuzqGcoy3jtRfiHYRrB/KFgt7TAZk8lC2Oev2h31aG9IvulzW0YWSz3e1JzrnaBf6mU=) 2026-04-07 00:22:06.388015 | orchestrator | changed: [testbed-manager] => 
(item=192.168.16.12 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHhm+O38SBA0V2lPk2i98wa7qzXeSUocWb21EXnISmaGxd/GoQtCnhaZgoM55JMt8j5YoASwxd7NsMOwdbE5s2E=) 2026-04-07 00:22:06.388037 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGfMogF7QmaUfRXTwgcnREFPBtawcIg3IfQJeRVTU1jP) 2026-04-07 00:22:10.403365 | orchestrator | 2026-04-07 00:22:10.403471 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:22:10.403488 | orchestrator | Tuesday 07 April 2026 00:22:06 +0000 (0:00:01.000) 0:00:23.512 ********* 2026-04-07 00:22:10.403521 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMgrctqlXpKHMIB9ocAxSr0dwLnsvUDM5Z3mbrCW7Yb/wyjSUmsbSJF5HLtGCNPq1nsizRRlTL78+8EiHB640Cw=) 2026-04-07 00:22:10.403538 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCdqNr7JW/Uz7Fj7uuN11hf0QxHRLuuu6F4zsLraBt83e/oyvI0dYmi3HrxuG94hyi9Ji0LCPbjj+NRXSBV1OTz1nI+j49htc/XTvmRtNo3d7nP5KHbHSFQlwCTg8wbY1dAZ4SbPmu/lOJjyXZlj3f3eK15Hem1Oxqdy8R9DCFtpP7J7+aLC4PoTxkIWYP9bdIudnlC8yMuuWJWDyOg58wIK8NUpJi24QhKjm8eNRAqMxCBiWR7ST3m9wtVILEEni55adVlXUn1X3k/V9CUCbAJL/yP5ooso/WAsqX9PLZwnwRZJYjPs/0hGPrTHyq9oa3auqDv1pPHSoOTiqId4qSs4GsbjFUkfRLAIXQzCPBojk2R2epsdCXV1+HocvF0rlv2DoqxAGT/KoIBCpWhEs4AH1WS2cU/FWaCf0IFBp2TdbpjA/fXXQNwHyKoAtReSgSZpekclgmyUgbZvuYvKRfSb1gOkPvuF1vmJVQR/34gvmw4Nnty1d4xmePexoC5yM0=) 2026-04-07 00:22:10.403553 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKiOy1YhlNugvcsISTS6rVBLZrMKPWJ2LxWFrQtpkWQX) 2026-04-07 00:22:10.403565 | orchestrator | 2026-04-07 00:22:10.403576 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:22:10.403587 | orchestrator | Tuesday 07 April 2026 00:22:07 +0000 (0:00:00.990) 0:00:24.503 
********* 2026-04-07 00:22:10.403603 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGQ8zmi+FI9+5PQvIGdrEj88/DQdefQCxL7IIb3EjSAvzsTaWYBoHGwGcL9IuK4U67Y74dGLxhdgnj3TtUarnlg=) 2026-04-07 00:22:10.403615 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrtRuHP22VPuPz5OyVNpG6DkDeOtd1+fs40gLrM/HB63m+8dWmGnj2AOcPlqCRlAGOsbVwNCa7R6RHUEr8Jn6qshFAN/9OJu17RyolePejXb0jKC7gywVu4ATrpo2g0MhVKAKHXfv0ghJZPpp1uvV+1zFbQK5hCzF3pyniPDcP6hYNfFnlIrycIsIlfloC2gvvPJuMpVBiMh4HeDyHVtgKWReEslik3jRAl1/A6iCyCYgaTMRyhkgHPL+l2l/Pf7cfjq4jneHiaOAWTi82LBijw0uykmuGRQSBP2ZjAykLmTVcuewdRK/KvoMGQrEET4sfxYuVOSXV2qGSrC7aZTUfLyDDNmqzKVwDJylDxzW48F4Jp0VKahHlEwDeNY+I1LztV4QdJS2AkO6sn6uYClBohjz840q9dfwmnRBmyTA5wFs+YBpAz7ehFFZI66vLi3b1AK6f2x28f6TCP97vVVSSshwIg3n8p8eholaOluHXszhVnvvNzLZB42myOm3+8B0=) 2026-04-07 00:22:10.403627 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFT+BKr11xRxnddKO9voUABE3rwoNy7qiM9x0Z6B+Dc5) 2026-04-07 00:22:10.403637 | orchestrator | 2026-04-07 00:22:10.403649 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-07 00:22:10.403660 | orchestrator | Tuesday 07 April 2026 00:22:08 +0000 (0:00:01.017) 0:00:25.521 ********* 2026-04-07 00:22:10.403671 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICiGRUa9o/mfArWxBQbH5wURLfVaMZKBgUizCHDtNg7w) 2026-04-07 00:22:10.403683 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDV4pWOKwvKIfkXmPufeNdR66vaGkeXi7SQ25CD+u51aXVjMxw1bLtfkjgIeEize0RABJz28FLuBVLC6yE83pvm4MBYaVPMCp67wL9oc1ilBaPdV93SDv1JZ9qC4TVlUXFEwwetW3JFZ0ovtQsL2WOvOHbj40Z2RyB79L9M/bGJOmlkp/DywaDC2MdEIL1ZZBi16GNoJmnz9sZc3N5sizqEseN2v4GVmjX1ZbED1hN5boc/gZsiSkL+LmHMqlOaQPvOKZuE3tACNmNuPqGhqAuHTJfaonrGgc5CAf5Q2Y7Br1xaz8qjIURYsz8qWI49ftSEVh8zWjlsUaUD9cBsn6ERQAIOYCwaPypOaLnkaMKL6BDNifDLDnej2DCHiJ9fZbwUxujfcRpSSQ3EEYIAH8oXQRLxD/qb+Uy9BbIQsdtNaH9CZJ5mpb5eNTYFGoT8cDORvLp7j7Ahm4mOeVWwNHI7Z6nqVgKXonptPrR/lMwTvQruYRPvlOlheBkVLcWGu9k=) 2026-04-07 00:22:10.403715 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBInTeTqLGXpjyfGjuO4vhktDbh+MsUPh+A/HQRiVFD5FEXDDzG7PrhOhYljrgqYq4Wfu8E3+/ToeRDUy0EFLdgk=) 2026-04-07 00:22:10.403727 | orchestrator | 2026-04-07 00:22:10.403817 | orchestrator | TASK [osism.commons.known_hosts : Write static known_hosts entries] ************ 2026-04-07 00:22:10.403830 | orchestrator | Tuesday 07 April 2026 00:22:09 +0000 (0:00:01.038) 0:00:26.559 ********* 2026-04-07 00:22:10.403842 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)  2026-04-07 00:22:10.403853 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2026-04-07 00:22:10.403864 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2026-04-07 00:22:10.403875 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2026-04-07 00:22:10.403885 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2026-04-07 00:22:10.403896 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2026-04-07 00:22:10.403910 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2026-04-07 00:22:10.403923 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:22:10.403937 | orchestrator | 2026-04-07 00:22:10.403966 | orchestrator | TASK [osism.commons.known_hosts : Write extra known_hosts 
entries] ************* 2026-04-07 00:22:10.403980 | orchestrator | Tuesday 07 April 2026 00:22:09 +0000 (0:00:00.167) 0:00:26.727 ********* 2026-04-07 00:22:10.403993 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:22:10.404006 | orchestrator | 2026-04-07 00:22:10.404019 | orchestrator | TASK [osism.commons.known_hosts : Delete known_hosts entries] ****************** 2026-04-07 00:22:10.404032 | orchestrator | Tuesday 07 April 2026 00:22:09 +0000 (0:00:00.038) 0:00:26.766 ********* 2026-04-07 00:22:10.404045 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:22:10.404058 | orchestrator | 2026-04-07 00:22:10.404070 | orchestrator | TASK [osism.commons.known_hosts : Set file permissions] ************************ 2026-04-07 00:22:10.404082 | orchestrator | Tuesday 07 April 2026 00:22:09 +0000 (0:00:00.051) 0:00:26.818 ********* 2026-04-07 00:22:10.404095 | orchestrator | changed: [testbed-manager] 2026-04-07 00:22:10.404107 | orchestrator | 2026-04-07 00:22:10.404119 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:22:10.404132 | orchestrator | testbed-manager : ok=31  changed=15  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2026-04-07 00:22:10.404147 | orchestrator | 2026-04-07 00:22:10.404160 | orchestrator | 2026-04-07 00:22:10.404172 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:22:10.404185 | orchestrator | Tuesday 07 April 2026 00:22:10 +0000 (0:00:00.489) 0:00:27.307 ********* 2026-04-07 00:22:10.404223 | orchestrator | =============================================================================== 2026-04-07 00:22:10.404237 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname --- 6.33s 2026-04-07 00:22:10.404251 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host --- 5.23s 2026-04-07 00:22:10.404263 | orchestrator | 
osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.23s 2026-04-07 00:22:10.404274 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.07s 2026-04-07 00:22:10.404285 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.05s 2026-04-07 00:22:10.404296 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.04s 2026-04-07 00:22:10.404306 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.04s 2026-04-07 00:22:10.404317 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.03s 2026-04-07 00:22:10.404328 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.02s 2026-04-07 00:22:10.404339 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.01s 2026-04-07 00:22:10.404362 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.01s 2026-04-07 00:22:10.404373 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.01s 2026-04-07 00:22:10.404384 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.00s 2026-04-07 00:22:10.404395 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.00s 2026-04-07 00:22:10.404405 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 0.99s 2026-04-07 00:22:10.404416 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 0.99s 2026-04-07 00:22:10.404502 | orchestrator | osism.commons.known_hosts : Set file permissions ------------------------ 0.49s 2026-04-07 00:22:10.404514 | orchestrator | osism.commons.known_hosts : Write static known_hosts entries ------------ 0.17s 2026-04-07 00:22:10.404524 | orchestrator | 
osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host --- 0.17s 2026-04-07 00:22:10.404536 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname --- 0.16s 2026-04-07 00:22:10.574271 | orchestrator | + osism apply squid 2026-04-07 00:22:21.827269 | orchestrator | 2026-04-07 00:22:21 | INFO  | Prepare task for execution of squid. 2026-04-07 00:22:21.901241 | orchestrator | 2026-04-07 00:22:21 | INFO  | Task edee631d-9704-4948-bd38-fee820d29746 (squid) was prepared for execution. 2026-04-07 00:22:21.901330 | orchestrator | 2026-04-07 00:22:21 | INFO  | It takes a moment until task edee631d-9704-4948-bd38-fee820d29746 (squid) has been started and output is visible here. 2026-04-07 00:24:17.845942 | orchestrator | 2026-04-07 00:24:17.846145 | orchestrator | PLAY [Apply role squid] ******************************************************** 2026-04-07 00:24:17.846170 | orchestrator | 2026-04-07 00:24:17.846205 | orchestrator | TASK [osism.services.squid : Include install tasks] **************************** 2026-04-07 00:24:17.846218 | orchestrator | Tuesday 07 April 2026 00:22:24 +0000 (0:00:00.196) 0:00:00.196 ********* 2026-04-07 00:24:17.846230 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/squid/tasks/install-Debian-family.yml for testbed-manager 2026-04-07 00:24:17.846242 | orchestrator | 2026-04-07 00:24:17.846253 | orchestrator | TASK [osism.services.squid : Install required packages] ************************ 2026-04-07 00:24:17.846264 | orchestrator | Tuesday 07 April 2026 00:22:25 +0000 (0:00:00.086) 0:00:00.283 ********* 2026-04-07 00:24:17.846275 | orchestrator | ok: [testbed-manager] 2026-04-07 00:24:17.846287 | orchestrator | 2026-04-07 00:24:17.846298 | orchestrator | TASK [osism.services.squid : Create required directories] ********************** 2026-04-07 00:24:17.846309 | orchestrator | Tuesday 07 April 2026 
00:22:27 +0000 (0:00:02.235) 0:00:02.518 ********* 2026-04-07 00:24:17.846321 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration) 2026-04-07 00:24:17.846332 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration/conf.d) 2026-04-07 00:24:17.846343 | orchestrator | ok: [testbed-manager] => (item=/opt/squid) 2026-04-07 00:24:17.846354 | orchestrator | 2026-04-07 00:24:17.846365 | orchestrator | TASK [osism.services.squid : Copy squid configuration files] ******************* 2026-04-07 00:24:17.846376 | orchestrator | Tuesday 07 April 2026 00:22:28 +0000 (0:00:01.192) 0:00:03.710 ********* 2026-04-07 00:24:17.846387 | orchestrator | changed: [testbed-manager] => (item=osism.conf) 2026-04-07 00:24:17.846398 | orchestrator | 2026-04-07 00:24:17.846409 | orchestrator | TASK [osism.services.squid : Remove osism_allow_list.conf configuration file] *** 2026-04-07 00:24:17.846420 | orchestrator | Tuesday 07 April 2026 00:22:29 +0000 (0:00:01.006) 0:00:04.717 ********* 2026-04-07 00:24:17.846433 | orchestrator | ok: [testbed-manager] 2026-04-07 00:24:17.846447 | orchestrator | 2026-04-07 00:24:17.846459 | orchestrator | TASK [osism.services.squid : Copy docker-compose.yml file] ********************* 2026-04-07 00:24:17.846472 | orchestrator | Tuesday 07 April 2026 00:22:29 +0000 (0:00:00.324) 0:00:05.042 ********* 2026-04-07 00:24:17.846491 | orchestrator | changed: [testbed-manager] 2026-04-07 00:24:17.846543 | orchestrator | 2026-04-07 00:24:17.846568 | orchestrator | TASK [osism.services.squid : Manage squid service] ***************************** 2026-04-07 00:24:17.846582 | orchestrator | Tuesday 07 April 2026 00:22:30 +0000 (0:00:00.864) 0:00:05.906 ********* 2026-04-07 00:24:17.846595 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage squid service (10 retries left). 
2026-04-07 00:24:17.846608 | orchestrator | ok: [testbed-manager] 2026-04-07 00:24:17.846620 | orchestrator | 2026-04-07 00:24:17.846635 | orchestrator | RUNNING HANDLER [osism.services.squid : Restart squid service] ***************** 2026-04-07 00:24:17.846648 | orchestrator | Tuesday 07 April 2026 00:23:04 +0000 (0:00:34.208) 0:00:40.115 ********* 2026-04-07 00:24:17.846659 | orchestrator | changed: [testbed-manager] 2026-04-07 00:24:17.846670 | orchestrator | 2026-04-07 00:24:17.846680 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for squid service to start] ******* 2026-04-07 00:24:17.846691 | orchestrator | Tuesday 07 April 2026 00:23:16 +0000 (0:00:12.018) 0:00:52.133 ********* 2026-04-07 00:24:17.846702 | orchestrator | Pausing for 60 seconds 2026-04-07 00:24:17.846713 | orchestrator | changed: [testbed-manager] 2026-04-07 00:24:17.846724 | orchestrator | 2026-04-07 00:24:17.846735 | orchestrator | RUNNING HANDLER [osism.services.squid : Register that squid service was restarted] *** 2026-04-07 00:24:17.846752 | orchestrator | Tuesday 07 April 2026 00:24:17 +0000 (0:01:00.077) 0:01:52.210 ********* 2026-04-07 00:24:17.846763 | orchestrator | ok: [testbed-manager] 2026-04-07 00:24:17.846774 | orchestrator | 2026-04-07 00:24:17.846784 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for an healthy squid service] ***** 2026-04-07 00:24:17.846796 | orchestrator | Tuesday 07 April 2026 00:24:17 +0000 (0:00:00.057) 0:01:52.268 ********* 2026-04-07 00:24:17.846806 | orchestrator | changed: [testbed-manager] 2026-04-07 00:24:17.846817 | orchestrator | 2026-04-07 00:24:17.846864 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:24:17.846876 | orchestrator | testbed-manager : ok=11  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:24:17.846887 | orchestrator | 2026-04-07 00:24:17.846898 | orchestrator | 2026-04-07 00:24:17.846909 | orchestrator | 
TASKS RECAP ******************************************************************** 2026-04-07 00:24:17.846920 | orchestrator | Tuesday 07 April 2026 00:24:17 +0000 (0:00:00.598) 0:01:52.866 ********* 2026-04-07 00:24:17.846930 | orchestrator | =============================================================================== 2026-04-07 00:24:17.846941 | orchestrator | osism.services.squid : Wait for squid service to start ----------------- 60.08s 2026-04-07 00:24:17.846952 | orchestrator | osism.services.squid : Manage squid service ---------------------------- 34.21s 2026-04-07 00:24:17.846963 | orchestrator | osism.services.squid : Restart squid service --------------------------- 12.02s 2026-04-07 00:24:17.846973 | orchestrator | osism.services.squid : Install required packages ------------------------ 2.24s 2026-04-07 00:24:17.846984 | orchestrator | osism.services.squid : Create required directories ---------------------- 1.19s 2026-04-07 00:24:17.846995 | orchestrator | osism.services.squid : Copy squid configuration files ------------------- 1.01s 2026-04-07 00:24:17.847006 | orchestrator | osism.services.squid : Copy docker-compose.yml file --------------------- 0.87s 2026-04-07 00:24:17.847025 | orchestrator | osism.services.squid : Wait for an healthy squid service ---------------- 0.60s 2026-04-07 00:24:17.847044 | orchestrator | osism.services.squid : Remove osism_allow_list.conf configuration file --- 0.32s 2026-04-07 00:24:17.847063 | orchestrator | osism.services.squid : Include install tasks ---------------------------- 0.09s 2026-04-07 00:24:17.847083 | orchestrator | osism.services.squid : Register that squid service was restarted -------- 0.06s 2026-04-07 00:24:17.975434 | orchestrator | + [[ 10.0.0 != \l\a\t\e\s\t ]] 2026-04-07 00:24:17.975554 | orchestrator | ++ semver 10.0.0 10.0.0-0 2026-04-07 00:24:18.046213 | orchestrator | + [[ 1 -ge 0 ]] 2026-04-07 00:24:18.046315 | orchestrator | + /opt/configuration/scripts/set-kolla-namespace.sh 
kolla/release/ 2026-04-07 00:24:18.050766 | orchestrator | + set -e 2026-04-07 00:24:18.050815 | orchestrator | + NAMESPACE=kolla/release/ 2026-04-07 00:24:18.050857 | orchestrator | + sed -i 's#docker_namespace: .*#docker_namespace: kolla/release/#g' /opt/configuration/inventory/group_vars/all/kolla.yml 2026-04-07 00:24:18.056988 | orchestrator | ++ semver 10.0.0 9.0.0 2026-04-07 00:24:18.106579 | orchestrator | + [[ 1 -lt 0 ]] 2026-04-07 00:24:18.107141 | orchestrator | + osism apply operator -u ubuntu -l testbed-nodes 2026-04-07 00:24:29.430875 | orchestrator | 2026-04-07 00:24:29 | INFO  | Prepare task for execution of operator. 2026-04-07 00:24:29.500013 | orchestrator | 2026-04-07 00:24:29 | INFO  | Task 5f3d1650-dffc-4c5d-be59-f2c4f6da730b (operator) was prepared for execution. 2026-04-07 00:24:29.500119 | orchestrator | 2026-04-07 00:24:29 | INFO  | It takes a moment until task 5f3d1650-dffc-4c5d-be59-f2c4f6da730b (operator) has been started and output is visible here. 2026-04-07 00:24:44.604764 | orchestrator | 2026-04-07 00:24:44.604931 | orchestrator | PLAY [Make ssh pipelining working] ********************************************* 2026-04-07 00:24:44.604950 | orchestrator | 2026-04-07 00:24:44.604962 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-04-07 00:24:44.604988 | orchestrator | Tuesday 07 April 2026 00:24:32 +0000 (0:00:00.182) 0:00:00.182 ********* 2026-04-07 00:24:44.605066 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:24:44.605082 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:24:44.605093 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:24:44.605105 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:24:44.605116 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:24:44.605126 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:24:44.605137 | orchestrator | 2026-04-07 00:24:44.605149 | orchestrator | TASK [Do not require tty for all users] 
**************************************** 2026-04-07 00:24:44.605160 | orchestrator | Tuesday 07 April 2026 00:24:36 +0000 (0:00:03.399) 0:00:03.581 ********* 2026-04-07 00:24:44.605171 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:24:44.605182 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:24:44.605193 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:24:44.605203 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:24:44.605214 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:24:44.605225 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:24:44.605236 | orchestrator | 2026-04-07 00:24:44.605247 | orchestrator | PLAY [Apply role operator] ***************************************************** 2026-04-07 00:24:44.605258 | orchestrator | 2026-04-07 00:24:44.605269 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2026-04-07 00:24:44.605289 | orchestrator | Tuesday 07 April 2026 00:24:36 +0000 (0:00:00.859) 0:00:04.440 ********* 2026-04-07 00:24:44.605315 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:24:44.605342 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:24:44.605362 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:24:44.605381 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:24:44.605401 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:24:44.605421 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:24:44.605440 | orchestrator | 2026-04-07 00:24:44.605461 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2026-04-07 00:24:44.605482 | orchestrator | Tuesday 07 April 2026 00:24:37 +0000 (0:00:00.150) 0:00:04.591 ********* 2026-04-07 00:24:44.605502 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:24:44.605522 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:24:44.605543 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:24:44.605563 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:24:44.605583 | orchestrator | ok: 
[testbed-node-4] 2026-04-07 00:24:44.605604 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:24:44.605625 | orchestrator | 2026-04-07 00:24:44.605644 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2026-04-07 00:24:44.605659 | orchestrator | Tuesday 07 April 2026 00:24:37 +0000 (0:00:00.140) 0:00:04.731 ********* 2026-04-07 00:24:44.605672 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:24:44.605686 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:24:44.605697 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:24:44.605708 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:24:44.605719 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:24:44.605753 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:24:44.605765 | orchestrator | 2026-04-07 00:24:44.605776 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2026-04-07 00:24:44.605787 | orchestrator | Tuesday 07 April 2026 00:24:37 +0000 (0:00:00.657) 0:00:05.389 ********* 2026-04-07 00:24:44.605798 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:24:44.605809 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:24:44.605820 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:24:44.605884 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:24:44.605904 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:24:44.605921 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:24:44.605939 | orchestrator | 2026-04-07 00:24:44.605953 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2026-04-07 00:24:44.605964 | orchestrator | Tuesday 07 April 2026 00:24:38 +0000 (0:00:00.897) 0:00:06.286 ********* 2026-04-07 00:24:44.605975 | orchestrator | changed: [testbed-node-0] => (item=adm) 2026-04-07 00:24:44.605986 | orchestrator | changed: [testbed-node-1] => (item=adm) 2026-04-07 00:24:44.605997 | orchestrator | 
changed: [testbed-node-2] => (item=adm) 2026-04-07 00:24:44.606007 | orchestrator | changed: [testbed-node-3] => (item=adm) 2026-04-07 00:24:44.606095 | orchestrator | changed: [testbed-node-4] => (item=adm) 2026-04-07 00:24:44.606110 | orchestrator | changed: [testbed-node-5] => (item=adm) 2026-04-07 00:24:44.606121 | orchestrator | changed: [testbed-node-1] => (item=sudo) 2026-04-07 00:24:44.606131 | orchestrator | changed: [testbed-node-3] => (item=sudo) 2026-04-07 00:24:44.606142 | orchestrator | changed: [testbed-node-2] => (item=sudo) 2026-04-07 00:24:44.606153 | orchestrator | changed: [testbed-node-0] => (item=sudo) 2026-04-07 00:24:44.606164 | orchestrator | changed: [testbed-node-5] => (item=sudo) 2026-04-07 00:24:44.606174 | orchestrator | changed: [testbed-node-4] => (item=sudo) 2026-04-07 00:24:44.606185 | orchestrator | 2026-04-07 00:24:44.606196 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] ************************* 2026-04-07 00:24:44.606220 | orchestrator | Tuesday 07 April 2026 00:24:40 +0000 (0:00:01.344) 0:00:07.631 ********* 2026-04-07 00:24:44.606232 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:24:44.606243 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:24:44.606254 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:24:44.606265 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:24:44.606276 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:24:44.606287 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:24:44.606299 | orchestrator | 2026-04-07 00:24:44.606318 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] *** 2026-04-07 00:24:44.606340 | orchestrator | Tuesday 07 April 2026 00:24:41 +0000 (0:00:01.312) 0:00:08.944 ********* 2026-04-07 00:24:44.606366 | orchestrator | changed: [testbed-node-2] => (item=export LANGUAGE=C.UTF-8) 2026-04-07 00:24:44.606384 | orchestrator | changed: [testbed-node-0] => (item=export 
LANGUAGE=C.UTF-8) 2026-04-07 00:24:44.606402 | orchestrator | changed: [testbed-node-3] => (item=export LANGUAGE=C.UTF-8) 2026-04-07 00:24:44.606420 | orchestrator | changed: [testbed-node-1] => (item=export LANGUAGE=C.UTF-8) 2026-04-07 00:24:44.606439 | orchestrator | changed: [testbed-node-5] => (item=export LANGUAGE=C.UTF-8) 2026-04-07 00:24:44.606484 | orchestrator | changed: [testbed-node-4] => (item=export LANGUAGE=C.UTF-8) 2026-04-07 00:24:44.606497 | orchestrator | changed: [testbed-node-3] => (item=export LANG=C.UTF-8) 2026-04-07 00:24:44.606508 | orchestrator | changed: [testbed-node-2] => (item=export LANG=C.UTF-8) 2026-04-07 00:24:44.606519 | orchestrator | changed: [testbed-node-5] => (item=export LANG=C.UTF-8) 2026-04-07 00:24:44.606529 | orchestrator | changed: [testbed-node-1] => (item=export LANG=C.UTF-8) 2026-04-07 00:24:44.606540 | orchestrator | changed: [testbed-node-0] => (item=export LANG=C.UTF-8) 2026-04-07 00:24:44.606551 | orchestrator | changed: [testbed-node-4] => (item=export LANG=C.UTF-8) 2026-04-07 00:24:44.606561 | orchestrator | changed: [testbed-node-3] => (item=export LC_ALL=C.UTF-8) 2026-04-07 00:24:44.606584 | orchestrator | [WARNING]: Module remote_tmp /root/.ansible/tmp did not exist and was created 2026-04-07 00:24:44.606595 | orchestrator | with a mode of 0700, this may cause issues when running as another user. 
To 2026-04-07 00:24:44.606607 | orchestrator | avoid this, create the remote_tmp dir with the correct permissions manually 2026-04-07 00:24:44.606617 | orchestrator | changed: [testbed-node-1] => (item=export LC_ALL=C.UTF-8) 2026-04-07 00:24:44.606628 | orchestrator | changed: [testbed-node-5] => (item=export LC_ALL=C.UTF-8) 2026-04-07 00:24:44.606639 | orchestrator | changed: [testbed-node-2] => (item=export LC_ALL=C.UTF-8) 2026-04-07 00:24:44.606649 | orchestrator | changed: [testbed-node-0] => (item=export LC_ALL=C.UTF-8) 2026-04-07 00:24:44.606660 | orchestrator | changed: [testbed-node-4] => (item=export LC_ALL=C.UTF-8) 2026-04-07 00:24:44.606671 | orchestrator | 2026-04-07 00:24:44.606681 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] *** 2026-04-07 00:24:44.606693 | orchestrator | Tuesday 07 April 2026 00:24:42 +0000 (0:00:01.216) 0:00:10.160 ********* 2026-04-07 00:24:44.606704 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:24:44.606715 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:24:44.606726 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:24:44.606737 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:24:44.606748 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:24:44.606758 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:24:44.606769 | orchestrator | 2026-04-07 00:24:44.606782 | orchestrator | TASK [osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file] *** 2026-04-07 00:24:44.606801 | orchestrator | Tuesday 07 April 2026 00:24:42 +0000 (0:00:00.167) 0:00:10.327 ********* 2026-04-07 00:24:44.606962 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:24:44.607015 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:24:44.607026 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:24:44.607036 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:24:44.607045 | orchestrator | skipping: 
[testbed-node-4] 2026-04-07 00:24:44.607055 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:24:44.607065 | orchestrator | 2026-04-07 00:24:44.607075 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] ************************** 2026-04-07 00:24:44.607084 | orchestrator | Tuesday 07 April 2026 00:24:42 +0000 (0:00:00.169) 0:00:10.497 ********* 2026-04-07 00:24:44.607094 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:24:44.607104 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:24:44.607114 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:24:44.607123 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:24:44.607133 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:24:44.607142 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:24:44.607152 | orchestrator | 2026-04-07 00:24:44.607161 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2026-04-07 00:24:44.607171 | orchestrator | Tuesday 07 April 2026 00:24:43 +0000 (0:00:00.531) 0:00:11.028 ********* 2026-04-07 00:24:44.607181 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:24:44.607190 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:24:44.607200 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:24:44.607209 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:24:44.607219 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:24:44.607228 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:24:44.607238 | orchestrator | 2026-04-07 00:24:44.607248 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 2026-04-07 00:24:44.607257 | orchestrator | Tuesday 07 April 2026 00:24:43 +0000 (0:00:00.185) 0:00:11.214 ********* 2026-04-07 00:24:44.607267 | orchestrator | changed: [testbed-node-1] => (item=None) 2026-04-07 00:24:44.607277 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:24:44.607286 | orchestrator | changed: 
[testbed-node-0] => (item=None) 2026-04-07 00:24:44.607296 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-04-07 00:24:44.607306 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:24:44.607326 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-04-07 00:24:44.607335 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:24:44.607345 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:24:44.607355 | orchestrator | changed: [testbed-node-2] => (item=None) 2026-04-07 00:24:44.607364 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:24:44.607374 | orchestrator | changed: [testbed-node-4] => (item=None) 2026-04-07 00:24:44.607384 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:24:44.607393 | orchestrator | 2026-04-07 00:24:44.607403 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2026-04-07 00:24:44.607413 | orchestrator | Tuesday 07 April 2026 00:24:44 +0000 (0:00:00.719) 0:00:11.933 ********* 2026-04-07 00:24:44.607422 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:24:44.607432 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:24:44.607442 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:24:44.607451 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:24:44.607461 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:24:44.607471 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:24:44.607480 | orchestrator | 2026-04-07 00:24:44.607490 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2026-04-07 00:24:44.607500 | orchestrator | Tuesday 07 April 2026 00:24:44 +0000 (0:00:00.126) 0:00:12.059 ********* 2026-04-07 00:24:44.607509 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:24:44.607519 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:24:44.607528 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:24:44.607538 | orchestrator | skipping: 
[testbed-node-3] 2026-04-07 00:24:44.607561 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:24:45.873344 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:24:45.873439 | orchestrator | 2026-04-07 00:24:45.873454 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2026-04-07 00:24:45.873466 | orchestrator | Tuesday 07 April 2026 00:24:44 +0000 (0:00:00.143) 0:00:12.203 ********* 2026-04-07 00:24:45.873476 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:24:45.873486 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:24:45.873495 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:24:45.873505 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:24:45.873515 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:24:45.873524 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:24:45.873534 | orchestrator | 2026-04-07 00:24:45.873544 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2026-04-07 00:24:45.873554 | orchestrator | Tuesday 07 April 2026 00:24:44 +0000 (0:00:00.139) 0:00:12.343 ********* 2026-04-07 00:24:45.873563 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:24:45.873573 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:24:45.873582 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:24:45.873592 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:24:45.873601 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:24:45.873611 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:24:45.873620 | orchestrator | 2026-04-07 00:24:45.873630 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2026-04-07 00:24:45.873640 | orchestrator | Tuesday 07 April 2026 00:24:45 +0000 (0:00:00.694) 0:00:13.038 ********* 2026-04-07 00:24:45.873650 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:24:45.873659 | orchestrator | skipping: 
[testbed-node-1] 2026-04-07 00:24:45.873673 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:24:45.873694 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:24:45.873717 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:24:45.873732 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:24:45.873747 | orchestrator | 2026-04-07 00:24:45.873763 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:24:45.873779 | orchestrator | testbed-node-0 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-04-07 00:24:45.873963 | orchestrator | testbed-node-1 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-04-07 00:24:45.873994 | orchestrator | testbed-node-2 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-04-07 00:24:45.874006 | orchestrator | testbed-node-3 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-04-07 00:24:45.874074 | orchestrator | testbed-node-4 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-04-07 00:24:45.874088 | orchestrator | testbed-node-5 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-04-07 00:24:45.874100 | orchestrator | 2026-04-07 00:24:45.874111 | orchestrator | 2026-04-07 00:24:45.874122 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:24:45.874131 | orchestrator | Tuesday 07 April 2026 00:24:45 +0000 (0:00:00.219) 0:00:13.258 ********* 2026-04-07 00:24:45.874141 | orchestrator | =============================================================================== 2026-04-07 00:24:45.874151 | orchestrator | Gathering Facts --------------------------------------------------------- 3.40s 2026-04-07 00:24:45.874174 | orchestrator | osism.commons.operator : Add user to additional groups ------------------ 1.34s 2026-04-07 00:24:45.874184 | 
orchestrator | osism.commons.operator : Copy user sudoers file ------------------------- 1.31s 2026-04-07 00:24:45.874194 | orchestrator | osism.commons.operator : Set language variables in .bashrc configuration file --- 1.22s 2026-04-07 00:24:45.874205 | orchestrator | osism.commons.operator : Create user ------------------------------------ 0.90s 2026-04-07 00:24:45.874215 | orchestrator | Do not require tty for all users ---------------------------------------- 0.86s 2026-04-07 00:24:45.874224 | orchestrator | osism.commons.operator : Set ssh authorized keys ------------------------ 0.72s 2026-04-07 00:24:45.874234 | orchestrator | osism.commons.operator : Set password ----------------------------------- 0.69s 2026-04-07 00:24:45.874243 | orchestrator | osism.commons.operator : Create operator group -------------------------- 0.66s 2026-04-07 00:24:45.874253 | orchestrator | osism.commons.operator : Create .ssh directory -------------------------- 0.53s 2026-04-07 00:24:45.874262 | orchestrator | osism.commons.operator : Unset & lock password -------------------------- 0.22s 2026-04-07 00:24:45.874272 | orchestrator | osism.commons.operator : Check number of SSH authorized keys ------------ 0.19s 2026-04-07 00:24:45.874282 | orchestrator | osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file --- 0.17s 2026-04-07 00:24:45.874291 | orchestrator | osism.commons.operator : Set custom environment variables in .bashrc configuration file --- 0.17s 2026-04-07 00:24:45.874301 | orchestrator | osism.commons.operator : Gather variables for each operating system ----- 0.15s 2026-04-07 00:24:45.874310 | orchestrator | osism.commons.operator : Set authorized GitHub accounts ----------------- 0.14s 2026-04-07 00:24:45.874319 | orchestrator | osism.commons.operator : Set operator_groups variable to default value --- 0.14s 2026-04-07 00:24:45.874329 | orchestrator | osism.commons.operator : Delete authorized GitHub accounts -------------- 0.14s 2026-04-07 
00:24:45.874386 | orchestrator | osism.commons.operator : Delete ssh authorized keys --------------------- 0.13s 2026-04-07 00:24:46.045952 | orchestrator | + osism apply --environment custom facts 2026-04-07 00:24:47.276364 | orchestrator | 2026-04-07 00:24:47 | INFO  | Trying to run play facts in environment custom 2026-04-07 00:24:57.395162 | orchestrator | 2026-04-07 00:24:57 | INFO  | Prepare task for execution of facts. 2026-04-07 00:24:57.470116 | orchestrator | 2026-04-07 00:24:57 | INFO  | Task ae9d02d4-ed00-4ed9-9605-aef327c930f9 (facts) was prepared for execution. 2026-04-07 00:24:57.470217 | orchestrator | 2026-04-07 00:24:57 | INFO  | It takes a moment until task ae9d02d4-ed00-4ed9-9605-aef327c930f9 (facts) has been started and output is visible here. 2026-04-07 00:25:39.677436 | orchestrator | 2026-04-07 00:25:39.677529 | orchestrator | PLAY [Copy custom network devices fact] **************************************** 2026-04-07 00:25:39.677541 | orchestrator | 2026-04-07 00:25:39.677549 | orchestrator | TASK [Create custom facts directory] ******************************************* 2026-04-07 00:25:39.677556 | orchestrator | Tuesday 07 April 2026 00:25:00 +0000 (0:00:00.086) 0:00:00.086 ********* 2026-04-07 00:25:39.677563 | orchestrator | ok: [testbed-manager] 2026-04-07 00:25:39.677570 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:25:39.677578 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:25:39.677585 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:25:39.677591 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:25:39.677597 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:25:39.677604 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:25:39.677610 | orchestrator | 2026-04-07 00:25:39.677616 | orchestrator | TASK [Copy fact file] ********************************************************** 2026-04-07 00:25:39.677623 | orchestrator | Tuesday 07 April 2026 00:25:01 +0000 (0:00:01.294) 0:00:01.381 
********* 2026-04-07 00:25:39.677629 | orchestrator | ok: [testbed-manager] 2026-04-07 00:25:39.677635 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:25:39.677641 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:25:39.677648 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:25:39.677654 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:25:39.677660 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:25:39.677666 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:25:39.677672 | orchestrator | 2026-04-07 00:25:39.677679 | orchestrator | PLAY [Copy custom ceph devices facts] ****************************************** 2026-04-07 00:25:39.677685 | orchestrator | 2026-04-07 00:25:39.677691 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2026-04-07 00:25:39.677710 | orchestrator | Tuesday 07 April 2026 00:25:02 +0000 (0:00:01.145) 0:00:02.526 ********* 2026-04-07 00:25:39.677717 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:25:39.677723 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:25:39.677730 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:25:39.677736 | orchestrator | 2026-04-07 00:25:39.677742 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2026-04-07 00:25:39.677749 | orchestrator | Tuesday 07 April 2026 00:25:02 +0000 (0:00:00.071) 0:00:02.598 ********* 2026-04-07 00:25:39.677755 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:25:39.677762 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:25:39.677768 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:25:39.677774 | orchestrator | 2026-04-07 00:25:39.677780 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2026-04-07 00:25:39.677787 | orchestrator | Tuesday 07 April 2026 00:25:02 +0000 (0:00:00.159) 0:00:02.757 ********* 2026-04-07 00:25:39.677793 | orchestrator | ok: [testbed-node-3] 2026-04-07 
00:25:39.677800 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:25:39.677806 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:25:39.677812 | orchestrator | 2026-04-07 00:25:39.677818 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2026-04-07 00:25:39.677824 | orchestrator | Tuesday 07 April 2026 00:25:03 +0000 (0:00:00.179) 0:00:02.937 ********* 2026-04-07 00:25:39.677832 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:25:39.677839 | orchestrator | 2026-04-07 00:25:39.677903 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2026-04-07 00:25:39.677910 | orchestrator | Tuesday 07 April 2026 00:25:03 +0000 (0:00:00.103) 0:00:03.040 ********* 2026-04-07 00:25:39.677916 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:25:39.677922 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:25:39.677928 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:25:39.677934 | orchestrator | 2026-04-07 00:25:39.677941 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2026-04-07 00:25:39.677965 | orchestrator | Tuesday 07 April 2026 00:25:03 +0000 (0:00:00.408) 0:00:03.449 ********* 2026-04-07 00:25:39.677972 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:25:39.677978 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:25:39.677984 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:25:39.677990 | orchestrator | 2026-04-07 00:25:39.677998 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2026-04-07 00:25:39.678006 | orchestrator | Tuesday 07 April 2026 00:25:03 +0000 (0:00:00.098) 0:00:03.548 ********* 2026-04-07 00:25:39.678056 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:25:39.678064 | orchestrator | 
changed: [testbed-node-4] 2026-04-07 00:25:39.678071 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:25:39.678079 | orchestrator | 2026-04-07 00:25:39.678086 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2026-04-07 00:25:39.678093 | orchestrator | Tuesday 07 April 2026 00:25:04 +0000 (0:00:00.994) 0:00:04.542 ********* 2026-04-07 00:25:39.678101 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:25:39.678108 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:25:39.678115 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:25:39.678122 | orchestrator | 2026-04-07 00:25:39.678151 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2026-04-07 00:25:39.678159 | orchestrator | Tuesday 07 April 2026 00:25:05 +0000 (0:00:00.462) 0:00:05.005 ********* 2026-04-07 00:25:39.678167 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:25:39.678174 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:25:39.678181 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:25:39.678189 | orchestrator | 2026-04-07 00:25:39.678196 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2026-04-07 00:25:39.678203 | orchestrator | Tuesday 07 April 2026 00:25:06 +0000 (0:00:01.062) 0:00:06.067 ********* 2026-04-07 00:25:39.678211 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:25:39.678218 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:25:39.678225 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:25:39.678232 | orchestrator | 2026-04-07 00:25:39.678239 | orchestrator | TASK [Install required packages (RedHat)] ************************************** 2026-04-07 00:25:39.678247 | orchestrator | Tuesday 07 April 2026 00:25:22 +0000 (0:00:16.405) 0:00:22.473 ********* 2026-04-07 00:25:39.678254 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:25:39.678261 | orchestrator | skipping: [testbed-node-4] 
2026-04-07 00:25:39.678268 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:25:39.678276 | orchestrator | 2026-04-07 00:25:39.678283 | orchestrator | TASK [Install required packages (Debian)] ************************************** 2026-04-07 00:25:39.678305 | orchestrator | Tuesday 07 April 2026 00:25:22 +0000 (0:00:00.092) 0:00:22.565 ********* 2026-04-07 00:25:39.678312 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:25:39.678320 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:25:39.678327 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:25:39.678334 | orchestrator | 2026-04-07 00:25:39.678342 | orchestrator | TASK [Create custom facts directory] ******************************************* 2026-04-07 00:25:39.678350 | orchestrator | Tuesday 07 April 2026 00:25:30 +0000 (0:00:08.052) 0:00:30.618 ********* 2026-04-07 00:25:39.678358 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:25:39.678364 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:25:39.678370 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:25:39.678376 | orchestrator | 2026-04-07 00:25:39.678382 | orchestrator | TASK [Copy fact files] ********************************************************* 2026-04-07 00:25:39.678389 | orchestrator | Tuesday 07 April 2026 00:25:31 +0000 (0:00:00.446) 0:00:31.065 ********* 2026-04-07 00:25:39.678395 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices) 2026-04-07 00:25:39.678401 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices) 2026-04-07 00:25:39.678407 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices) 2026-04-07 00:25:39.678413 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices_all) 2026-04-07 00:25:39.678420 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices_all) 2026-04-07 00:25:39.678433 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices_all) 2026-04-07 00:25:39.678439 | orchestrator | 
changed: [testbed-node-3] => (item=testbed_ceph_osd_devices) 2026-04-07 00:25:39.678450 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices) 2026-04-07 00:25:39.678457 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices) 2026-04-07 00:25:39.678463 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices_all) 2026-04-07 00:25:39.678469 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices_all) 2026-04-07 00:25:39.678475 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices_all) 2026-04-07 00:25:39.678481 | orchestrator | 2026-04-07 00:25:39.678488 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2026-04-07 00:25:39.678494 | orchestrator | Tuesday 07 April 2026 00:25:34 +0000 (0:00:03.572) 0:00:34.637 ********* 2026-04-07 00:25:39.678500 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:25:39.678507 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:25:39.678513 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:25:39.678519 | orchestrator | 2026-04-07 00:25:39.678525 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2026-04-07 00:25:39.678532 | orchestrator | 2026-04-07 00:25:39.678538 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2026-04-07 00:25:39.678545 | orchestrator | Tuesday 07 April 2026 00:25:36 +0000 (0:00:01.299) 0:00:35.937 ********* 2026-04-07 00:25:39.678551 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:25:39.678557 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:25:39.678563 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:25:39.678569 | orchestrator | ok: [testbed-manager] 2026-04-07 00:25:39.678576 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:25:39.678582 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:25:39.678588 | orchestrator | ok: [testbed-node-5] 
2026-04-07 00:25:39.678594 | orchestrator |
2026-04-07 00:25:39.678601 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:25:39.678608 | orchestrator | testbed-manager : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:25:39.678614 | orchestrator | testbed-node-0 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:25:39.678622 | orchestrator | testbed-node-1 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:25:39.678629 | orchestrator | testbed-node-2 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:25:39.678635 | orchestrator | testbed-node-3 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:25:39.678642 | orchestrator | testbed-node-4 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:25:39.678648 | orchestrator | testbed-node-5 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:25:39.678654 | orchestrator |
2026-04-07 00:25:39.678660 | orchestrator |
2026-04-07 00:25:39.678667 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:25:39.678673 | orchestrator | Tuesday 07 April 2026 00:25:39 +0000 (0:00:03.650) 0:00:39.588 *********
2026-04-07 00:25:39.678679 | orchestrator | ===============================================================================
2026-04-07 00:25:39.678685 | orchestrator | osism.commons.repository : Update package cache ------------------------ 16.41s
2026-04-07 00:25:39.678692 | orchestrator | Install required packages (Debian) -------------------------------------- 8.05s
2026-04-07 00:25:39.678698 | orchestrator | Gathers facts about hosts ----------------------------------------------- 3.65s
2026-04-07 00:25:39.678709 | orchestrator | Copy fact files --------------------------------------------------------- 3.57s
2026-04-07 00:25:39.678715 | orchestrator | osism.commons.repository : Force update of package cache ---------------- 1.30s
2026-04-07 00:25:39.678721 | orchestrator | Create custom facts directory ------------------------------------------- 1.29s
2026-04-07 00:25:39.678731 | orchestrator | Copy fact file ---------------------------------------------------------- 1.15s
2026-04-07 00:25:39.855474 | orchestrator | osism.commons.repository : Copy ubuntu.sources file --------------------- 1.06s
2026-04-07 00:25:39.855601 | orchestrator | osism.commons.repository : Copy 99osism apt configuration --------------- 0.99s
2026-04-07 00:25:39.855624 | orchestrator | osism.commons.repository : Remove sources.list file --------------------- 0.46s
2026-04-07 00:25:39.855642 | orchestrator | Create custom facts directory ------------------------------------------- 0.45s
2026-04-07 00:25:39.855660 | orchestrator | osism.commons.repository : Create /etc/apt/sources.list.d directory ----- 0.41s
2026-04-07 00:25:39.855679 | orchestrator | osism.commons.repository : Set repositories to default ------------------ 0.18s
2026-04-07 00:25:39.855698 | orchestrator | osism.commons.repository : Set repository_default fact to default value --- 0.16s
2026-04-07 00:25:39.855717 | orchestrator | osism.commons.repository : Include distribution specific repository tasks --- 0.10s
2026-04-07 00:25:39.855729 | orchestrator | osism.commons.repository : Include tasks for Ubuntu < 24.04 ------------- 0.10s
2026-04-07 00:25:39.855740 | orchestrator | Install required packages (RedHat) -------------------------------------- 0.09s
2026-04-07 00:25:39.855751 | orchestrator | osism.commons.repository : Gather variables for each operating system --- 0.07s
2026-04-07 00:25:40.027743 | orchestrator | + osism apply bootstrap
2026-04-07 00:25:51.356744 | orchestrator | 2026-04-07 00:25:51 | INFO  | Prepare task for execution of bootstrap.
2026-04-07 00:25:51.431817 | orchestrator | 2026-04-07 00:25:51 | INFO  | Task 46e8d02d-e312-4ee3-b372-e40f01163817 (bootstrap) was prepared for execution.
2026-04-07 00:25:51.431997 | orchestrator | 2026-04-07 00:25:51 | INFO  | It takes a moment until task 46e8d02d-e312-4ee3-b372-e40f01163817 (bootstrap) has been started and output is visible here.
2026-04-07 00:26:06.523972 | orchestrator |
2026-04-07 00:26:06.524068 | orchestrator | PLAY [Group hosts based on state bootstrap] ************************************
2026-04-07 00:26:06.524084 | orchestrator |
2026-04-07 00:26:06.524094 | orchestrator | TASK [Group hosts based on state bootstrap] ************************************
2026-04-07 00:26:06.524103 | orchestrator | Tuesday 07 April 2026 00:25:54 +0000 (0:00:00.186) 0:00:00.186 *********
2026-04-07 00:26:06.524112 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:06.524123 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:06.524132 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:06.524141 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:06.524149 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:06.524158 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:06.524166 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:06.524173 | orchestrator |
2026-04-07 00:26:06.524181 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2026-04-07 00:26:06.524188 | orchestrator |
2026-04-07 00:26:06.524196 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-04-07 00:26:06.524206 | orchestrator | Tuesday 07 April 2026 00:25:54 +0000 (0:00:00.286) 0:00:00.472 *********
2026-04-07 00:26:06.524214 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:06.524222 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:06.524230 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:06.524238 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:06.524246 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:06.524255 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:06.524263 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:06.524271 | orchestrator |
2026-04-07 00:26:06.524279 | orchestrator | PLAY [Gather facts for all hosts (if using --limit)] ***************************
2026-04-07 00:26:06.524287 | orchestrator |
2026-04-07 00:26:06.524316 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-04-07 00:26:06.524324 | orchestrator | Tuesday 07 April 2026 00:25:59 +0000 (0:00:04.698) 0:00:05.171 *********
2026-04-07 00:26:06.524333 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)
2026-04-07 00:26:06.524342 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)
2026-04-07 00:26:06.524350 | orchestrator | skipping: [testbed-node-0] => (item=testbed-manager)
2026-04-07 00:26:06.524358 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)
2026-04-07 00:26:06.524366 | orchestrator | skipping: [testbed-node-1] => (item=testbed-manager)
2026-04-07 00:26:06.524374 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2026-04-07 00:26:06.524382 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)
2026-04-07 00:26:06.524390 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2026-04-07 00:26:06.524398 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)
2026-04-07 00:26:06.524407 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)
2026-04-07 00:26:06.524415 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)
2026-04-07 00:26:06.524423 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2026-04-07 00:26:06.524432 | orchestrator | skipping: [testbed-node-2] => (item=testbed-manager)
2026-04-07 00:26:06.524440 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)
2026-04-07 00:26:06.524448 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)
2026-04-07 00:26:06.524456 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)
2026-04-07 00:26:06.524464 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-3)
2026-04-07 00:26:06.524472 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)
2026-04-07 00:26:06.524480 | orchestrator | skipping: [testbed-node-3] => (item=testbed-manager)
2026-04-07 00:26:06.524489 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)
2026-04-07 00:26:06.524496 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-4)
2026-04-07 00:26:06.524504 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)
2026-04-07 00:26:06.524512 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:26:06.524520 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)
2026-04-07 00:26:06.524528 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-04-07 00:26:06.524536 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-5)
2026-04-07 00:26:06.524544 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:26:06.524552 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)
2026-04-07 00:26:06.524561 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:26:06.524569 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-04-07 00:26:06.524577 | orchestrator | skipping: [testbed-node-4] => (item=testbed-manager)
2026-04-07 00:26:06.524586 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)
2026-04-07 00:26:06.524594 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-04-07 00:26:06.524602 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2026-04-07 00:26:06.524610 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-3)
2026-04-07 00:26:06.524618 | orchestrator | skipping: [testbed-node-5] => (item=testbed-manager)
2026-04-07 00:26:06.524626 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-07 00:26:06.524633 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2026-04-07 00:26:06.524641 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-07 00:26:06.524649 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2026-04-07 00:26:06.524657 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-4)
2026-04-07 00:26:06.524664 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2026-04-07 00:26:06.524684 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2026-04-07 00:26:06.524702 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-07 00:26:06.524711 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-5)
2026-04-07 00:26:06.524719 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:26:06.524727 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:26:06.524750 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-3)
2026-04-07 00:26:06.524758 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2026-04-07 00:26:06.524766 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-4)
2026-04-07 00:26:06.524775 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-3)
2026-04-07 00:26:06.524783 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-5)
2026-04-07 00:26:06.524792 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:26:06.524799 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-4)
2026-04-07 00:26:06.524807 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-5)
2026-04-07 00:26:06.524815 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:26:06.524823 | orchestrator |
2026-04-07 00:26:06.524830 | orchestrator | PLAY [Apply bootstrap roles part 1] ********************************************
2026-04-07 00:26:06.524837 | orchestrator |
2026-04-07 00:26:06.524844 | orchestrator | TASK [osism.commons.hostname : Set hostname] ***********************************
2026-04-07 00:26:06.524872 | orchestrator | Tuesday 07 April 2026 00:26:00 +0000 (0:00:00.449) 0:00:05.620 *********
2026-04-07 00:26:06.524880 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:06.524887 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:06.524895 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:06.524902 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:06.524909 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:06.524917 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:06.524924 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:06.524931 | orchestrator |
2026-04-07 00:26:06.524940 | orchestrator | TASK [osism.commons.hostname : Copy /etc/hostname] *****************************
2026-04-07 00:26:06.524946 | orchestrator | Tuesday 07 April 2026 00:26:01 +0000 (0:00:01.264) 0:00:06.885 *********
2026-04-07 00:26:06.524951 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:06.524956 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:06.524960 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:06.524965 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:06.524969 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:06.524974 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:06.524978 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:06.524983 | orchestrator |
2026-04-07 00:26:06.524988 | orchestrator | TASK [osism.commons.hosts : Include type specific tasks] ***********************
2026-04-07 00:26:06.524992 | orchestrator | Tuesday 07 April 2026 00:26:02 +0000 (0:00:00.242) 0:00:08.156 *********
2026-04-07 00:26:06.524998 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/hosts/tasks/type-template.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:26:06.525005 | orchestrator |
2026-04-07 00:26:06.525009 | orchestrator | TASK [osism.commons.hosts : Copy /etc/hosts file] ******************************
2026-04-07 00:26:06.525014 | orchestrator | Tuesday 07 April 2026 00:26:02 +0000 (0:00:00.242) 0:00:08.398 *********
2026-04-07 00:26:06.525019 | orchestrator | changed: [testbed-manager]
2026-04-07 00:26:06.525023 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:06.525028 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:06.525032 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:26:06.525037 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:06.525041 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:26:06.525046 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:26:06.525050 | orchestrator |
2026-04-07 00:26:06.525055 | orchestrator | TASK [osism.commons.proxy : Include distribution specific tasks] ***************
2026-04-07 00:26:06.525060 | orchestrator | Tuesday 07 April 2026 00:26:04 +0000 (0:00:01.402) 0:00:09.801 *********
2026-04-07 00:26:06.525069 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:26:06.525075 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/proxy/tasks/Debian-family.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:26:06.525082 | orchestrator |
2026-04-07 00:26:06.525087 | orchestrator | TASK [osism.commons.proxy : Configure proxy parameters for apt] ****************
2026-04-07 00:26:06.525092 | orchestrator | Tuesday 07 April 2026 00:26:04 +0000 (0:00:00.217) 0:00:10.018 *********
2026-04-07 00:26:06.525096 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:06.525101 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:06.525105 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:26:06.525110 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:06.525114 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:26:06.525119 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:26:06.525123 | orchestrator |
2026-04-07 00:26:06.525128 | orchestrator | TASK [osism.commons.proxy : Set system wide settings in environment file] ******
2026-04-07 00:26:06.525133 | orchestrator | Tuesday 07 April 2026 00:26:05 +0000 (0:00:01.012) 0:00:11.030 *********
2026-04-07 00:26:06.525137 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:26:06.525142 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:26:06.525146 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:06.525151 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:06.525156 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:06.525160 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:26:06.525165 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:26:06.525169 | orchestrator |
2026-04-07 00:26:06.525174 | orchestrator | TASK [osism.commons.proxy : Remove system wide settings in environment file] ***
2026-04-07 00:26:06.525178 | orchestrator | Tuesday 07 April 2026 00:26:06 +0000 (0:00:00.532) 0:00:11.563 *********
2026-04-07 00:26:06.525183 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:26:06.525188 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:26:06.525192 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:26:06.525197 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:26:06.525201 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:26:06.525206 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:26:06.525210 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:06.525215 | orchestrator |
2026-04-07 00:26:06.525220 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] ***
2026-04-07 00:26:06.525226 | orchestrator | Tuesday 07 April 2026 00:26:06 +0000 (0:00:00.194) 0:00:11.956 *********
2026-04-07 00:26:06.525231 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:26:06.525235 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:26:06.525246 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:26:18.009605 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:26:18.009760 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:26:18.009788 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:26:18.009807 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:26:18.009828 | orchestrator |
2026-04-07 00:26:18.009841 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] *********************
2026-04-07 00:26:18.009881 | orchestrator | Tuesday 07 April 2026 00:26:06 +0000 (0:00:00.194) 0:00:12.151 *********
2026-04-07 00:26:18.009896 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:26:18.009927 | orchestrator |
2026-04-07 00:26:18.009939 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] ***
2026-04-07 00:26:18.009951 | orchestrator | Tuesday 07 April 2026 00:26:06 +0000 (0:00:00.275) 0:00:12.426 *********
2026-04-07 00:26:18.009962 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:26:18.010002 | orchestrator |
2026-04-07 00:26:18.010084 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] ***
2026-04-07 00:26:18.010100 | orchestrator | Tuesday 07 April 2026 00:26:07 +0000 (0:00:00.280) 0:00:12.707 *********
2026-04-07 00:26:18.010114 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.010127 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.010140 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:18.010154 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.010166 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:18.010179 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.010192 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:18.010204 | orchestrator |
2026-04-07 00:26:18.010217 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] *************
2026-04-07 00:26:18.010230 | orchestrator | Tuesday 07 April 2026 00:26:08 +0000 (0:00:01.244) 0:00:13.952 *********
2026-04-07 00:26:18.010243 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:26:18.010256 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:26:18.010268 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:26:18.010281 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:26:18.010294 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:26:18.010307 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:26:18.010319 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:26:18.010332 | orchestrator |
2026-04-07 00:26:18.010345 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] *****
2026-04-07 00:26:18.010358 | orchestrator | Tuesday 07 April 2026 00:26:08 +0000 (0:00:00.203) 0:00:14.155 *********
2026-04-07 00:26:18.010370 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.010383 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:18.010396 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:18.010409 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:18.010420 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.010430 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.010441 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.010452 | orchestrator |
2026-04-07 00:26:18.010463 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] *******
2026-04-07 00:26:18.010474 | orchestrator | Tuesday 07 April 2026 00:26:09 +0000 (0:00:00.515) 0:00:14.671 *********
2026-04-07 00:26:18.010485 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:26:18.010496 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:26:18.010506 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:26:18.010517 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:26:18.010528 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:26:18.010538 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:26:18.010549 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:26:18.010559 | orchestrator |
2026-04-07 00:26:18.010570 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] ***
2026-04-07 00:26:18.010582 | orchestrator | Tuesday 07 April 2026 00:26:09 +0000 (0:00:00.226) 0:00:14.898 *********
2026-04-07 00:26:18.010593 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.010604 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:18.010614 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:18.010625 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:18.010636 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:26:18.010646 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:26:18.010656 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:26:18.010667 | orchestrator |
2026-04-07 00:26:18.010690 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] *********************
2026-04-07 00:26:18.010702 | orchestrator | Tuesday 07 April 2026 00:26:09 +0000 (0:00:00.559) 0:00:15.457 *********
2026-04-07 00:26:18.010712 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.010723 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:18.010734 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:18.010753 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:18.010764 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:26:18.010775 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:26:18.010786 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:26:18.010796 | orchestrator |
2026-04-07 00:26:18.010808 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ********
2026-04-07 00:26:18.010819 | orchestrator | Tuesday 07 April 2026 00:26:11 +0000 (0:00:01.093) 0:00:16.550 *********
2026-04-07 00:26:18.010829 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.010840 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.010870 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:18.010882 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:18.010893 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:18.010908 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.010920 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.010930 | orchestrator |
2026-04-07 00:26:18.010941 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] ***
2026-04-07 00:26:18.010953 | orchestrator | Tuesday 07 April 2026 00:26:11 +0000 (0:00:00.974) 0:00:17.524 *********
2026-04-07 00:26:18.010985 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:26:18.010997 | orchestrator |
2026-04-07 00:26:18.011008 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] *************
2026-04-07 00:26:18.011019 | orchestrator | Tuesday 07 April 2026 00:26:12 +0000 (0:00:00.308) 0:00:17.833 *********
2026-04-07 00:26:18.011030 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:26:18.011041 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:26:18.011052 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:18.011062 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:18.011073 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:18.011084 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:26:18.011094 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:26:18.011105 | orchestrator |
2026-04-07 00:26:18.011116 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] ***
2026-04-07 00:26:18.011127 | orchestrator | Tuesday 07 April 2026 00:26:13 +0000 (0:00:01.263) 0:00:19.096 *********
2026-04-07 00:26:18.011138 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.011149 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:18.011160 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:18.011171 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:18.011182 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.011193 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.011203 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.011214 | orchestrator |
2026-04-07 00:26:18.011225 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] ***
2026-04-07 00:26:18.011236 | orchestrator | Tuesday 07 April 2026 00:26:13 +0000 (0:00:00.197) 0:00:19.294 *********
2026-04-07 00:26:18.011246 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.011257 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:18.011268 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:18.011279 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:18.011290 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.011300 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.011311 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.011322 | orchestrator |
2026-04-07 00:26:18.011333 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ******************
2026-04-07 00:26:18.011344 | orchestrator | Tuesday 07 April 2026 00:26:13 +0000 (0:00:00.191) 0:00:19.486 *********
2026-04-07 00:26:18.011354 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.011365 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:18.011376 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:18.011387 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:18.011397 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.011415 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.011426 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.011437 | orchestrator |
2026-04-07 00:26:18.011448 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] ***
2026-04-07 00:26:18.011458 | orchestrator | Tuesday 07 April 2026 00:26:14 +0000 (0:00:00.238) 0:00:19.725 *********
2026-04-07 00:26:18.011470 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:26:18.011483 | orchestrator |
2026-04-07 00:26:18.011494 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] *****
2026-04-07 00:26:18.011505 | orchestrator | Tuesday 07 April 2026 00:26:14 +0000 (0:00:00.285) 0:00:20.011 *********
2026-04-07 00:26:18.011516 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.011527 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:18.011538 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:18.011549 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:18.011559 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.011570 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.011581 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.011592 | orchestrator |
2026-04-07 00:26:18.011602 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] *************
2026-04-07 00:26:18.011613 | orchestrator | Tuesday 07 April 2026 00:26:15 +0000 (0:00:00.603) 0:00:20.614 *********
2026-04-07 00:26:18.011624 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:26:18.011635 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:26:18.011646 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:26:18.011657 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:26:18.011668 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:26:18.011679 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:26:18.011689 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:26:18.011700 | orchestrator |
2026-04-07 00:26:18.011711 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] ***************
2026-04-07 00:26:18.011722 | orchestrator | Tuesday 07 April 2026 00:26:15 +0000 (0:00:00.214) 0:00:20.829 *********
2026-04-07 00:26:18.011733 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.011744 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:18.011754 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:18.011765 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.011776 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:18.011787 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.011798 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.011809 | orchestrator |
2026-04-07 00:26:18.011819 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] *********************
2026-04-07 00:26:18.011830 | orchestrator | Tuesday 07 April 2026 00:26:16 +0000 (0:00:00.576) 0:00:21.952 *********
2026-04-07 00:26:18.011841 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.011870 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:18.011881 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:18.011892 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.011903 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:18.011914 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:18.011925 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.011935 | orchestrator |
2026-04-07 00:26:18.011952 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] *********************
2026-04-07 00:26:18.011963 | orchestrator | Tuesday 07 April 2026 00:26:16 +0000 (0:00:01.043) 0:00:22.529 *********
2026-04-07 00:26:18.011974 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:18.011985 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:18.011997 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:18.012007 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:18.012025 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:59.279760 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:59.279928 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:59.279971 | orchestrator |
2026-04-07 00:26:59.279986 | orchestrator | TASK [osism.commons.repository : Update package cache] *************************
2026-04-07 00:26:59.279999 | orchestrator | Tuesday 07 April 2026 00:26:18 +0000 (0:00:01.043) 0:00:23.572 *********
2026-04-07 00:26:59.280010 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:59.280021 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:59.280032 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:59.280043 | orchestrator | changed: [testbed-manager]
2026-04-07 00:26:59.280054 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:59.280065 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:59.280075 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:59.280086 | orchestrator |
2026-04-07 00:26:59.280097 | orchestrator | TASK [osism.services.rsyslog : Gather variables for each operating system] *****
2026-04-07 00:26:59.280108 | orchestrator | Tuesday 07 April 2026 00:26:36 +0000 (0:00:18.358) 0:00:41.931 *********
2026-04-07 00:26:59.280119 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:59.280129 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:59.280140 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:59.280151 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:59.280161 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:59.280172 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:59.280182 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:59.280193 | orchestrator |
2026-04-07 00:26:59.280204 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_user variable to default value] *****
2026-04-07 00:26:59.280215 | orchestrator | Tuesday 07 April 2026 00:26:36 +0000 (0:00:00.199) 0:00:42.139 *********
2026-04-07 00:26:59.280226 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:59.280236 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:59.280247 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:59.280257 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:59.280268 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:59.280279 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:59.280289 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:59.280301 | orchestrator |
2026-04-07 00:26:59.280319 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_workdir variable to default value] ***
2026-04-07 00:26:59.280342 | orchestrator | Tuesday 07 April 2026 00:26:36 +0000 (0:00:00.198) 0:00:42.338 *********
2026-04-07 00:26:59.280368 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:59.280386 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:59.280404 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:59.280422 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:59.280439 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:59.280459 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:59.280477 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:59.280493 | orchestrator |
2026-04-07 00:26:59.280504 | orchestrator | TASK [osism.services.rsyslog : Include distribution specific install tasks] ****
2026-04-07 00:26:59.280515 | orchestrator | Tuesday 07 April 2026 00:26:37 +0000 (0:00:00.198) 0:00:42.536 *********
2026-04-07 00:26:59.280528 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:26:59.280542 | orchestrator |
2026-04-07 00:26:59.280554 | orchestrator | TASK [osism.services.rsyslog : Install rsyslog package] ************************
2026-04-07 00:26:59.280565 | orchestrator | Tuesday 07 April 2026 00:26:37 +0000 (0:00:00.252) 0:00:42.789 *********
2026-04-07 00:26:59.280575 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:59.280586 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:59.280597 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:59.280608 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:59.280618 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:59.280629 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:59.280640 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:59.280651 | orchestrator |
2026-04-07 00:26:59.280662 | orchestrator | TASK [osism.services.rsyslog : Copy rsyslog.conf configuration file] ***********
2026-04-07 00:26:59.280683 | orchestrator | Tuesday 07 April 2026 00:26:39 +0000 (0:00:01.844) 0:00:44.633 *********
2026-04-07 00:26:59.280694 | orchestrator | changed: [testbed-manager]
2026-04-07 00:26:59.280705 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:59.280716 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:26:59.280727 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:26:59.280738 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:26:59.280749 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:26:59.280759 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:26:59.280770 | orchestrator |
2026-04-07 00:26:59.280781 | orchestrator | TASK [osism.services.rsyslog : Manage rsyslog service] *************************
2026-04-07 00:26:59.280792 | orchestrator | Tuesday 07 April 2026 00:26:40 +0000 (0:00:01.100) 0:00:45.734 *********
2026-04-07 00:26:59.280803 | orchestrator | ok: [testbed-manager]
2026-04-07 00:26:59.280814 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:26:59.280825 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:26:59.280836 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:26:59.280846 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:26:59.280891 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:26:59.280902 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:26:59.280913 | orchestrator |
2026-04-07 00:26:59.280924 | orchestrator | TASK [osism.services.rsyslog : Include fluentd tasks] **************************
2026-04-07 00:26:59.280935 | orchestrator | Tuesday 07 April 2026 00:26:40 +0000 (0:00:00.261) 0:00:46.530 *********
2026-04-07 00:26:59.280947 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/fluentd.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:26:59.280960 | orchestrator |
2026-04-07 00:26:59.280971 | orchestrator | TASK [osism.services.rsyslog : Forward syslog message to local fluentd daemon] ***
2026-04-07 00:26:59.280983 | orchestrator | Tuesday 07 April 2026 00:26:41 +0000 (0:00:00.261) 0:00:46.792 *********
2026-04-07 00:26:59.280994 | orchestrator | changed: [testbed-manager]
2026-04-07 00:26:59.281005 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:26:59.281015 | orchestrator |
orchestrator | changed: [testbed-node-2] 2026-04-07 00:26:59.281026 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:26:59.281037 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:26:59.281048 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:26:59.281059 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:26:59.281070 | orchestrator | 2026-04-07 00:26:59.281103 | orchestrator | TASK [osism.services.rsyslog : Include additional log server tasks] ************ 2026-04-07 00:26:59.281115 | orchestrator | Tuesday 07 April 2026 00:26:42 +0000 (0:00:01.118) 0:00:47.911 ********* 2026-04-07 00:26:59.281126 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:26:59.281136 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:26:59.281147 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:26:59.281158 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:26:59.281169 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:26:59.281179 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:26:59.281190 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:26:59.281201 | orchestrator | 2026-04-07 00:26:59.281212 | orchestrator | TASK [osism.services.rsyslog : Include logrotate tasks] ************************ 2026-04-07 00:26:59.281223 | orchestrator | Tuesday 07 April 2026 00:26:42 +0000 (0:00:00.226) 0:00:48.137 ********* 2026-04-07 00:26:59.281234 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/logrotate.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:26:59.281245 | orchestrator | 2026-04-07 00:26:59.281256 | orchestrator | TASK [osism.services.rsyslog : Ensure logrotate package is installed] ********** 2026-04-07 00:26:59.281267 | orchestrator | Tuesday 07 April 2026 00:26:42 +0000 (0:00:00.274) 0:00:48.411 ********* 2026-04-07 00:26:59.281278 | orchestrator | ok: 
[testbed-manager] 2026-04-07 00:26:59.281289 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:26:59.281313 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:26:59.281332 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:26:59.281349 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:26:59.281368 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:26:59.281387 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:26:59.281406 | orchestrator | 2026-04-07 00:26:59.281424 | orchestrator | TASK [osism.services.rsyslog : Configure logrotate for rsyslog] **************** 2026-04-07 00:26:59.281439 | orchestrator | Tuesday 07 April 2026 00:26:44 +0000 (0:00:01.974) 0:00:50.386 ********* 2026-04-07 00:26:59.281450 | orchestrator | changed: [testbed-manager] 2026-04-07 00:26:59.281461 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:26:59.281472 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:26:59.281483 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:26:59.281493 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:26:59.281504 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:26:59.281515 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:26:59.281525 | orchestrator | 2026-04-07 00:26:59.281536 | orchestrator | TASK [osism.commons.systohc : Install util-linux-extra package] **************** 2026-04-07 00:26:59.281547 | orchestrator | Tuesday 07 April 2026 00:26:46 +0000 (0:00:01.207) 0:00:51.594 ********* 2026-04-07 00:26:59.281558 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:26:59.281568 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:26:59.281579 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:26:59.281590 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:26:59.281600 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:26:59.281611 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:26:59.281622 | orchestrator | changed: [testbed-manager] 2026-04-07 00:26:59.281632 | 
orchestrator | 2026-04-07 00:26:59.281643 | orchestrator | TASK [osism.commons.systohc : Sync hardware clock] ***************************** 2026-04-07 00:26:59.281654 | orchestrator | Tuesday 07 April 2026 00:26:56 +0000 (0:00:10.831) 0:01:02.425 ********* 2026-04-07 00:26:59.281665 | orchestrator | ok: [testbed-manager] 2026-04-07 00:26:59.281675 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:26:59.281686 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:26:59.281697 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:26:59.281707 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:26:59.281724 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:26:59.281750 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:26:59.281770 | orchestrator | 2026-04-07 00:26:59.281790 | orchestrator | TASK [osism.commons.configfs : Start sys-kernel-config mount] ****************** 2026-04-07 00:26:59.281809 | orchestrator | Tuesday 07 April 2026 00:26:57 +0000 (0:00:00.779) 0:01:03.205 ********* 2026-04-07 00:26:59.281827 | orchestrator | ok: [testbed-manager] 2026-04-07 00:26:59.281840 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:26:59.281882 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:26:59.281897 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:26:59.281908 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:26:59.281919 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:26:59.281929 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:26:59.281940 | orchestrator | 2026-04-07 00:26:59.281951 | orchestrator | TASK [osism.commons.packages : Gather variables for each operating system] ***** 2026-04-07 00:26:59.281961 | orchestrator | Tuesday 07 April 2026 00:26:58 +0000 (0:00:00.895) 0:01:04.100 ********* 2026-04-07 00:26:59.281972 | orchestrator | ok: [testbed-manager] 2026-04-07 00:26:59.281983 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:26:59.281993 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:26:59.282004 | orchestrator | ok: 
[testbed-node-2] 2026-04-07 00:26:59.282015 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:26:59.282111 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:26:59.282122 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:26:59.282133 | orchestrator | 2026-04-07 00:26:59.282144 | orchestrator | TASK [osism.commons.packages : Set required_packages_distribution variable to default value] *** 2026-04-07 00:26:59.282155 | orchestrator | Tuesday 07 April 2026 00:26:58 +0000 (0:00:00.207) 0:01:04.307 ********* 2026-04-07 00:26:59.282166 | orchestrator | ok: [testbed-manager] 2026-04-07 00:26:59.282186 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:26:59.282198 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:26:59.282208 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:26:59.282219 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:26:59.282241 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:26:59.282253 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:26:59.282264 | orchestrator | 2026-04-07 00:26:59.282275 | orchestrator | TASK [osism.commons.packages : Include distribution specific package tasks] **** 2026-04-07 00:26:59.282290 | orchestrator | Tuesday 07 April 2026 00:26:58 +0000 (0:00:00.197) 0:01:04.505 ********* 2026-04-07 00:26:59.282304 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/packages/tasks/package-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:26:59.282325 | orchestrator | 2026-04-07 00:26:59.282363 | orchestrator | TASK [osism.commons.packages : Install needrestart package] ******************** 2026-04-07 00:29:18.728755 | orchestrator | Tuesday 07 April 2026 00:26:59 +0000 (0:00:00.299) 0:01:04.804 ********* 2026-04-07 00:29:18.728906 | orchestrator | ok: [testbed-manager] 2026-04-07 00:29:18.728993 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:29:18.729009 | orchestrator | 
ok: [testbed-node-3] 2026-04-07 00:29:18.729020 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:29:18.729031 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:29:18.729042 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:29:18.729062 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:29:18.729082 | orchestrator | 2026-04-07 00:29:18.729102 | orchestrator | TASK [osism.commons.packages : Set needrestart mode] *************************** 2026-04-07 00:29:18.729121 | orchestrator | Tuesday 07 April 2026 00:27:01 +0000 (0:00:01.910) 0:01:06.715 ********* 2026-04-07 00:29:18.729140 | orchestrator | changed: [testbed-manager] 2026-04-07 00:29:18.729159 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:29:18.729179 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:29:18.729197 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:29:18.729217 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:29:18.729234 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:29:18.729251 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:29:18.729270 | orchestrator | 2026-04-07 00:29:18.729290 | orchestrator | TASK [osism.commons.packages : Set apt_cache_valid_time variable to default value] *** 2026-04-07 00:29:18.729311 | orchestrator | Tuesday 07 April 2026 00:27:01 +0000 (0:00:00.525) 0:01:07.241 ********* 2026-04-07 00:29:18.729330 | orchestrator | ok: [testbed-manager] 2026-04-07 00:29:18.729350 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:29:18.729371 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:29:18.729391 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:29:18.729412 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:29:18.729433 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:29:18.729454 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:29:18.729475 | orchestrator | 2026-04-07 00:29:18.729496 | orchestrator | TASK [osism.commons.packages : Update package cache] *************************** 2026-04-07 
00:29:18.729516 | orchestrator | Tuesday 07 April 2026 00:27:01 +0000 (0:00:00.212) 0:01:07.453 ********* 2026-04-07 00:29:18.729537 | orchestrator | ok: [testbed-manager] 2026-04-07 00:29:18.729558 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:29:18.729576 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:29:18.729589 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:29:18.729602 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:29:18.729615 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:29:18.729626 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:29:18.729637 | orchestrator | 2026-04-07 00:29:18.729649 | orchestrator | TASK [osism.commons.packages : Download upgrade packages] ********************** 2026-04-07 00:29:18.729660 | orchestrator | Tuesday 07 April 2026 00:27:03 +0000 (0:00:01.434) 0:01:08.888 ********* 2026-04-07 00:29:18.729671 | orchestrator | changed: [testbed-manager] 2026-04-07 00:29:18.729682 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:29:18.729697 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:29:18.729737 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:29:18.729749 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:29:18.729760 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:29:18.729771 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:29:18.729782 | orchestrator | 2026-04-07 00:29:18.729793 | orchestrator | TASK [osism.commons.packages : Upgrade packages] ******************************* 2026-04-07 00:29:18.729804 | orchestrator | Tuesday 07 April 2026 00:27:05 +0000 (0:00:01.994) 0:01:10.883 ********* 2026-04-07 00:29:18.729815 | orchestrator | ok: [testbed-manager] 2026-04-07 00:29:18.729826 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:29:18.729837 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:29:18.729848 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:29:18.729858 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:29:18.729869 | orchestrator | ok: 
[testbed-node-4] 2026-04-07 00:29:18.729880 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:29:18.729891 | orchestrator | 2026-04-07 00:29:18.729902 | orchestrator | TASK [osism.commons.packages : Download required packages] ********************* 2026-04-07 00:29:18.729913 | orchestrator | Tuesday 07 April 2026 00:27:07 +0000 (0:00:02.604) 0:01:13.487 ********* 2026-04-07 00:29:18.729924 | orchestrator | ok: [testbed-manager] 2026-04-07 00:29:18.729964 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:29:18.729976 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:29:18.729987 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:29:18.729998 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:29:18.730009 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:29:18.730101 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:29:18.730165 | orchestrator | 2026-04-07 00:29:18.730189 | orchestrator | TASK [osism.commons.packages : Install required packages] ********************** 2026-04-07 00:29:18.730208 | orchestrator | Tuesday 07 April 2026 00:27:47 +0000 (0:00:39.220) 0:01:52.707 ********* 2026-04-07 00:29:18.730227 | orchestrator | changed: [testbed-manager] 2026-04-07 00:29:18.730244 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:29:18.730255 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:29:18.730266 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:29:18.730277 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:29:18.730288 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:29:18.730299 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:29:18.730309 | orchestrator | 2026-04-07 00:29:18.730320 | orchestrator | TASK [osism.commons.packages : Remove useless packages from the cache] ********* 2026-04-07 00:29:18.730331 | orchestrator | Tuesday 07 April 2026 00:29:04 +0000 (0:01:17.431) 0:03:10.139 ********* 2026-04-07 00:29:18.730342 | orchestrator | ok: [testbed-manager] 2026-04-07 00:29:18.730353 | orchestrator | 
ok: [testbed-node-2] 2026-04-07 00:29:18.730364 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:29:18.730374 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:29:18.730385 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:29:18.730396 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:29:18.730407 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:29:18.730417 | orchestrator | 2026-04-07 00:29:18.730428 | orchestrator | TASK [osism.commons.packages : Remove dependencies that are no longer required] *** 2026-04-07 00:29:18.730440 | orchestrator | Tuesday 07 April 2026 00:29:06 +0000 (0:00:02.149) 0:03:12.288 ********* 2026-04-07 00:29:18.730451 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:29:18.730461 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:29:18.730488 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:29:18.730499 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:29:18.730509 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:29:18.730520 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:29:18.730530 | orchestrator | changed: [testbed-manager] 2026-04-07 00:29:18.730541 | orchestrator | 2026-04-07 00:29:18.730552 | orchestrator | TASK [osism.commons.sysctl : Include sysctl tasks] ***************************** 2026-04-07 00:29:18.730563 | orchestrator | Tuesday 07 April 2026 00:29:17 +0000 (0:00:10.826) 0:03:23.115 ********* 2026-04-07 00:29:18.730605 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'elasticsearch', 'value': [{'name': 'vm.max_map_count', 'value': 262144}]}) 2026-04-07 00:29:18.730645 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, 
testbed-node-5 => (item={'key': 'rabbitmq', 'value': [{'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}, {'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}, {'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}, {'name': 'net.core.wmem_max', 'value': 16777216}, {'name': 'net.core.rmem_max', 'value': 16777216}, {'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}, {'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}, {'name': 'net.core.somaxconn', 'value': 4096}, {'name': 'net.ipv4.tcp_syncookies', 'value': 0}, {'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}]}) 2026-04-07 00:29:18.730670 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'generic', 'value': [{'name': 'vm.swappiness', 'value': 1}]}) 2026-04-07 00:29:18.730697 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'compute', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2026-04-07 00:29:18.730716 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'network', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2026-04-07 00:29:18.730734 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'k3s_node', 'value': [{'name': 'fs.inotify.max_user_instances', 
'value': 1024}]}) 2026-04-07 00:29:18.730754 | orchestrator | 2026-04-07 00:29:18.730772 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on elasticsearch] *********** 2026-04-07 00:29:18.730791 | orchestrator | Tuesday 07 April 2026 00:29:17 +0000 (0:00:00.372) 0:03:23.487 ********* 2026-04-07 00:29:18.730811 | orchestrator | skipping: [testbed-manager] => (item={'name': 'vm.max_map_count', 'value': 262144})  2026-04-07 00:29:18.730829 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:29:18.730848 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'vm.max_map_count', 'value': 262144})  2026-04-07 00:29:18.730860 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:29:18.730871 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'vm.max_map_count', 'value': 262144})  2026-04-07 00:29:18.730882 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:29:18.730893 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'vm.max_map_count', 'value': 262144})  2026-04-07 00:29:18.730903 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:29:18.730914 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-07 00:29:18.730925 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-07 00:29:18.730963 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-07 00:29:18.730984 | orchestrator | 2026-04-07 00:29:18.731002 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on rabbitmq] **************** 2026-04-07 00:29:18.731015 | orchestrator | Tuesday 07 April 2026 00:29:18 +0000 (0:00:00.707) 0:03:24.195 ********* 2026-04-07 00:29:18.731026 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2026-04-07 00:29:18.731048 | orchestrator | skipping: [testbed-manager] => (item={'name': 
'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2026-04-07 00:29:18.731065 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2026-04-07 00:29:18.731076 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2026-04-07 00:29:18.731087 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2026-04-07 00:29:18.731108 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2026-04-07 00:29:28.791422 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2026-04-07 00:29:28.791520 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2026-04-07 00:29:28.791531 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2026-04-07 00:29:28.791539 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2026-04-07 00:29:28.791548 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:29:28.791558 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2026-04-07 00:29:28.791565 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2026-04-07 00:29:28.791573 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2026-04-07 00:29:28.791580 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2026-04-07 00:29:28.791587 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2026-04-07 00:29:28.791595 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2026-04-07 
00:29:28.791602 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2026-04-07 00:29:28.791610 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2026-04-07 00:29:28.791617 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2026-04-07 00:29:28.791624 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2026-04-07 00:29:28.791632 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2026-04-07 00:29:28.791639 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:29:28.791646 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2026-04-07 00:29:28.791654 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2026-04-07 00:29:28.791661 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2026-04-07 00:29:28.791668 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2026-04-07 00:29:28.791675 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2026-04-07 00:29:28.791682 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2026-04-07 00:29:28.791690 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2026-04-07 00:29:28.791697 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2026-04-07 00:29:28.791704 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2026-04-07 00:29:28.791715 | orchestrator | skipping: [testbed-node-4] => (item={'name': 
'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2026-04-07 00:29:28.791755 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:29:28.791771 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2026-04-07 00:29:28.791785 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2026-04-07 00:29:28.791798 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2026-04-07 00:29:28.791810 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2026-04-07 00:29:28.791821 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2026-04-07 00:29:28.791834 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2026-04-07 00:29:28.791846 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2026-04-07 00:29:28.791859 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2026-04-07 00:29:28.791872 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2026-04-07 00:29:28.791885 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:29:28.791898 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2026-04-07 00:29:28.791911 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2026-04-07 00:29:28.791924 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2026-04-07 00:29:28.791964 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2026-04-07 00:29:28.791980 | orchestrator | changed: [testbed-node-1] => (item={'name': 
'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2026-04-07 00:29:28.792012 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2026-04-07 00:29:28.792026 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2026-04-07 00:29:28.792039 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2026-04-07 00:29:28.792053 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2026-04-07 00:29:28.792065 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2026-04-07 00:29:28.792078 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2026-04-07 00:29:28.792095 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2026-04-07 00:29:28.792112 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2026-04-07 00:29:28.792124 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2026-04-07 00:29:28.792135 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2026-04-07 00:29:28.792148 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2026-04-07 00:29:28.792162 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2026-04-07 00:29:28.792174 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2026-04-07 00:29:28.792196 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2026-04-07 00:29:28.792204 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 
2026-04-07 00:29:28.792227 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-04-07 00:29:28.792235 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-04-07 00:29:28.792242 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-04-07 00:29:28.792259 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-04-07 00:29:28.792266 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-04-07 00:29:28.792273 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-04-07 00:29:28.792281 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-04-07 00:29:28.792288 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-04-07 00:29:28.792295 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-04-07 00:29:28.792302 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-04-07 00:29:28.792310 | orchestrator |
2026-04-07 00:29:28.792320 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on generic] *****************
2026-04-07 00:29:28.792332 | orchestrator | Tuesday 07 April 2026 00:29:25 +0000 (0:00:07.008) 0:03:31.203 *********
2026-04-07 00:29:28.792342 | orchestrator | changed: [testbed-manager] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-07 00:29:28.792353 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-07 00:29:28.792364 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-07 00:29:28.792375 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-07 00:29:28.792385 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-07 00:29:28.792397 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-07 00:29:28.792408 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-07 00:29:28.792419 | orchestrator |
2026-04-07 00:29:28.792432 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on compute] *****************
2026-04-07 00:29:28.792444 | orchestrator | Tuesday 07 April 2026 00:29:27 +0000 (0:00:01.543) 0:03:32.747 *********
2026-04-07 00:29:28.792456 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:28.792468 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:29:28.792476 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:28.792483 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:28.792491 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:29:28.792509 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:28.792521 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:29:28.792533 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:29:28.792546 | orchestrator | changed: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:28.792560 | orchestrator | changed: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:28.792584 | orchestrator | changed: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:42.351339 | orchestrator |
2026-04-07 00:29:42.351476 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on network] *****************
2026-04-07 00:29:42.351500 | orchestrator | Tuesday 07 April 2026 00:29:28 +0000 (0:00:01.603) 0:03:34.351 *********
2026-04-07 00:29:42.351519 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:42.351541 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:29:42.351563 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:42.351614 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:42.351634 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:29:42.351653 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:29:42.351673 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:42.351692 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:29:42.351712 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:42.351732 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:42.351751 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-07 00:29:42.351770 | orchestrator |
2026-04-07 00:29:42.351791 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on k3s_node] ****************
2026-04-07 00:29:42.351810 | orchestrator | Tuesday 07 April 2026 00:29:29 +0000 (0:00:00.488) 0:03:34.840 *********
2026-04-07 00:29:42.351830 | orchestrator | skipping: [testbed-manager] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-07 00:29:42.351851 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:29:42.351872 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-07 00:29:42.351894 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:29:42.351915 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-07 00:29:42.351936 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-07 00:29:42.352038 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:29:42.352060 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:29:42.352080 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-07 00:29:42.352099 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-07 00:29:42.352118 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-07 00:29:42.352137 | orchestrator |
2026-04-07 00:29:42.352157 | orchestrator | TASK [osism.commons.limits : Include limits tasks] *****************************
2026-04-07 00:29:42.352176 | orchestrator | Tuesday 07 April 2026 00:29:30 +0000 (0:00:01.645) 0:03:36.486 *********
2026-04-07 00:29:42.352196 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:29:42.352213 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:29:42.352231 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:29:42.352248 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:29:42.352266 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:29:42.352283 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:29:42.352301 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:29:42.352317 | orchestrator |
2026-04-07 00:29:42.352336 | orchestrator | TASK [osism.commons.services : Populate service facts] *************************
2026-04-07 00:29:42.352354 | orchestrator | Tuesday 07 April 2026 00:29:31 +0000 (0:00:00.228) 0:03:36.714 *********
2026-04-07 00:29:42.352370 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:29:42.352387 | orchestrator | ok: [testbed-manager]
2026-04-07 00:29:42.352402 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:29:42.352418 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:29:42.352434 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:29:42.352450 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:29:42.352467 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:29:42.352483 | orchestrator |
2026-04-07 00:29:42.352497 | orchestrator | TASK [osism.commons.services : Check services] *********************************
2026-04-07 00:29:42.352512 | orchestrator | Tuesday 07 April 2026 00:29:36 +0000 (0:00:05.382) 0:03:42.097 *********
2026-04-07 00:29:42.352527 | orchestrator | skipping: [testbed-manager] => (item=nscd)
2026-04-07 00:29:42.352558 | orchestrator | skipping: [testbed-node-0] => (item=nscd)
2026-04-07 00:29:42.352574 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:29:42.352589 | orchestrator | skipping: [testbed-node-1] => (item=nscd)
2026-04-07 00:29:42.352605 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:29:42.352620 | orchestrator | skipping: [testbed-node-2] => (item=nscd)
2026-04-07 00:29:42.352634 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:29:42.352650 | orchestrator | skipping: [testbed-node-3] => (item=nscd)
2026-04-07 00:29:42.352666 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:29:42.352681 | orchestrator | skipping: [testbed-node-4] => (item=nscd)
2026-04-07 00:29:42.352716 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:29:42.352733 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:29:42.352750 | orchestrator | skipping: [testbed-node-5] => (item=nscd)
2026-04-07 00:29:42.352765 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:29:42.352781 | orchestrator |
2026-04-07 00:29:42.352796 | orchestrator | TASK [osism.commons.services : Start/enable required services] *****************
2026-04-07 00:29:42.352811 | orchestrator | Tuesday 07 April 2026 00:29:36 +0000 (0:00:00.306) 0:03:42.403 *********
2026-04-07 00:29:42.352826 | orchestrator | ok: [testbed-manager] => (item=cron)
2026-04-07 00:29:42.352841 | orchestrator | ok: [testbed-node-1] => (item=cron)
2026-04-07 00:29:42.352857 | orchestrator | ok: [testbed-node-0] => (item=cron)
2026-04-07 00:29:42.352899 | orchestrator | ok: [testbed-node-2] => (item=cron)
2026-04-07 00:29:42.352916 | orchestrator | ok: [testbed-node-3] => (item=cron)
2026-04-07 00:29:42.352933 | orchestrator | ok: [testbed-node-4] => (item=cron)
2026-04-07 00:29:42.353162 | orchestrator | ok: [testbed-node-5] => (item=cron)
2026-04-07 00:29:42.353188 | orchestrator |
2026-04-07 00:29:42.353204 | orchestrator | TASK [osism.commons.motd : Include distribution specific configure tasks] ******
2026-04-07 00:29:42.353221 | orchestrator | Tuesday 07 April 2026 00:29:37 +0000 (0:00:01.051) 0:03:43.455 *********
2026-04-07 00:29:42.353240 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/motd/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:29:42.353259 | orchestrator |
2026-04-07 00:29:42.353276 | orchestrator | TASK [osism.commons.motd : Remove update-motd package] *************************
2026-04-07 00:29:42.353292 | orchestrator | Tuesday 07 April 2026 00:29:38 +0000 (0:00:00.446) 0:03:43.901 *********
2026-04-07 00:29:42.353309 | orchestrator | ok: [testbed-manager]
2026-04-07 00:29:42.353324 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:29:42.353340 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:29:42.353355 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:29:42.353458 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:29:42.353469 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:29:42.353479 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:29:42.353488 | orchestrator |
2026-04-07 00:29:42.353498 | orchestrator | TASK [osism.commons.motd : Check if /etc/default/motd-news exists] *************
2026-04-07 00:29:42.353508 | orchestrator | Tuesday 07 April 2026 00:29:39 +0000 (0:00:01.362) 0:03:45.263 *********
2026-04-07 00:29:42.353518 | orchestrator | ok: [testbed-manager]
2026-04-07 00:29:42.353528 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:29:42.353537 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:29:42.353547 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:29:42.353556 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:29:42.353566 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:29:42.353575 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:29:42.353585 | orchestrator |
2026-04-07 00:29:42.353595 | orchestrator | TASK [osism.commons.motd : Disable the dynamic motd-news service] **************
2026-04-07 00:29:42.353605 | orchestrator | Tuesday 07 April 2026 00:29:40 +0000 (0:00:00.703) 0:03:45.967 *********
2026-04-07 00:29:42.353615 | orchestrator | changed: [testbed-manager]
2026-04-07 00:29:42.353625 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:29:42.353634 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:29:42.353660 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:29:42.353670 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:29:42.353679 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:29:42.353689 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:29:42.353698 | orchestrator |
2026-04-07 00:29:42.353712 | orchestrator | TASK [osism.commons.motd : Get all configuration files in /etc/pam.d] **********
2026-04-07 00:29:42.353728 | orchestrator | Tuesday 07 April 2026 00:29:41 +0000 (0:00:00.768) 0:03:46.735 *********
2026-04-07 00:29:42.353744 | orchestrator | ok: [testbed-manager]
2026-04-07 00:29:42.353758 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:29:42.353774 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:29:42.353790 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:29:42.353806 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:29:42.353823 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:29:42.353840 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:29:42.353855 | orchestrator |
2026-04-07 00:29:42.353873 | orchestrator | TASK [osism.commons.motd : Remove pam_motd.so rule] ****************************
2026-04-07 00:29:42.353889 | orchestrator | Tuesday 07 April 2026 00:29:41 +0000 (0:00:00.617) 0:03:47.352 *********
2026-04-07 00:29:42.353913 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1775520312.2263503, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:42.353932 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1775520318.4101722, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:42.353982 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1775520333.0612152, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:42.354130 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1775520343.527103, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923084 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1775520340.2262754, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923224 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1775520330.2136364, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923241 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1775520339.100179, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923253 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923265 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923276 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923288 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923333 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923366 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923378 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-07 00:29:47.923391 | orchestrator |
2026-04-07 00:29:47.923404 | orchestrator | TASK [osism.commons.motd : Copy motd file] *************************************
2026-04-07 00:29:47.923418 | orchestrator | Tuesday 07 April 2026 00:29:42 +0000 (0:00:01.003) 0:03:48.356 *********
2026-04-07 00:29:47.923430 | orchestrator | changed: [testbed-manager]
2026-04-07 00:29:47.923442 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:29:47.923453 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:29:47.923464 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:29:47.923474 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:29:47.923485 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:29:47.923496 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:29:47.923506 | orchestrator |
2026-04-07 00:29:47.923518 | orchestrator | TASK [osism.commons.motd : Copy issue file] ************************************
2026-04-07 00:29:47.923529 | orchestrator | Tuesday 07 April 2026 00:29:43 +0000 (0:00:01.117) 0:03:49.473 *********
2026-04-07 00:29:47.923539 | orchestrator | changed: [testbed-manager]
2026-04-07 00:29:47.923550 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:29:47.923561 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:29:47.923572 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:29:47.923582 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:29:47.923593 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:29:47.923604 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:29:47.923614 | orchestrator |
2026-04-07 00:29:47.923625 | orchestrator | TASK [osism.commons.motd : Copy issue.net file] ********************************
2026-04-07 00:29:47.923636 | orchestrator | Tuesday 07 April 2026 00:29:45 +0000 (0:00:01.378) 0:03:50.682 *********
2026-04-07 00:29:47.923647 | orchestrator | changed: [testbed-manager]
2026-04-07 00:29:47.923657 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:29:47.923668 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:29:47.923679 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:29:47.923689 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:29:47.923700 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:29:47.923710 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:29:47.923721 | orchestrator |
2026-04-07 00:29:47.923732 | orchestrator | TASK [osism.commons.motd : Configure SSH to print the motd] ********************
2026-04-07 00:29:47.923743 | orchestrator | Tuesday 07 April 2026 00:29:46 +0000 (0:00:00.235) 0:03:52.061 *********
2026-04-07 00:29:47.923753 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:29:47.923764 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:29:47.923775 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:29:47.923785 | orchestrator | skipping: [testbed-node-2]
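The "Remove pam_motd.so rule" task above reports a changed item for /etc/pam.d/sshd and /etc/pam.d/login on every host. The role itself edits those files with Ansible modules; the standalone filter below is only an assumption-laden sketch of the net effect, dropping every PAM line that invokes pam_motd.so:

```python
def strip_pam_motd(pam_config: str) -> str:
    """Return the PAM config text with every line invoking pam_motd.so removed."""
    kept = [
        line for line in pam_config.splitlines()
        if "pam_motd.so" not in line
    ]
    return "\n".join(kept)


# Illustrative PAM stanza (typical Debian/Ubuntu content, not copied from the testbed).
sample = (
    "session    optional     pam_motd.so  motd=/run/motd.dynamic\n"
    "session    optional     pam_motd.so noupdate\n"
    "session    required     pam_limits.so"
)
print(strip_pam_motd(sample))
```

With pam_motd.so gone, the static /etc/motd copied in the next tasks is what sshd prints (subject to the PrintMotd setting adjusted right after).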
2026-04-07 00:29:47.923796 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:29:47.923811 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:29:47.923822 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:29:47.923839 | orchestrator |
2026-04-07 00:29:47.923850 | orchestrator | TASK [osism.commons.motd : Configure SSH to not print the motd] ****************
2026-04-07 00:29:47.923861 | orchestrator | Tuesday 07 April 2026 00:29:46 +0000 (0:00:00.235) 0:03:52.296 *********
2026-04-07 00:29:47.923871 | orchestrator | ok: [testbed-manager]
2026-04-07 00:29:47.923883 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:29:47.923894 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:29:47.923905 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:29:47.923916 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:29:47.923927 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:29:47.923937 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:29:47.923971 | orchestrator |
2026-04-07 00:29:47.923983 | orchestrator | TASK [osism.services.rng : Include distribution specific install tasks] ********
2026-04-07 00:29:47.923994 | orchestrator | Tuesday 07 April 2026 00:29:47 +0000 (0:00:00.782) 0:03:53.079 *********
2026-04-07 00:29:47.924007 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rng/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:29:47.924021 | orchestrator |
2026-04-07 00:29:47.924033 | orchestrator | TASK [osism.services.rng : Install rng package] ********************************
2026-04-07 00:29:47.924052 | orchestrator | Tuesday 07 April 2026 00:29:47 +0000 (0:00:00.373) 0:03:53.452 *********
2026-04-07 00:31:11.569155 | orchestrator | ok: [testbed-manager]
2026-04-07 00:31:11.569302 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:31:11.569329 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:31:11.569348 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:31:11.569365 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:31:11.569382 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:31:11.569399 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:31:11.569417 | orchestrator |
2026-04-07 00:31:11.569436 | orchestrator | TASK [osism.services.rng : Remove haveged package] *****************************
2026-04-07 00:31:11.569455 | orchestrator | Tuesday 07 April 2026 00:29:57 +0000 (0:00:09.204) 0:04:02.657 *********
2026-04-07 00:31:11.569472 | orchestrator | ok: [testbed-manager]
2026-04-07 00:31:11.569491 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:31:11.569510 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:31:11.569530 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:31:11.569550 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:31:11.569569 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:31:11.569588 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:31:11.569602 | orchestrator |
2026-04-07 00:31:11.569615 | orchestrator | TASK [osism.services.rng : Manage rng service] *********************************
2026-04-07 00:31:11.569629 | orchestrator | Tuesday 07 April 2026 00:29:58 +0000 (0:00:01.600) 0:04:04.257 *********
2026-04-07 00:31:11.569641 | orchestrator | ok: [testbed-manager]
2026-04-07 00:31:11.569654 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:31:11.569666 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:31:11.569678 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:31:11.569690 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:31:11.569701 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:31:11.569711 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:31:11.569722 | orchestrator |
2026-04-07 00:31:11.569734 | orchestrator | TASK [osism.commons.cleanup : Gather variables for each operating system] ******
2026-04-07 00:31:11.569746 | orchestrator |
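A pattern that recurs throughout this play: the same loop item reports changed on hosts in the targeted inventory group and skipping everywhere else (compare the compute and network runs of net.netfilter.nf_conntrack_max earlier, or the group-scoped rng results just above). A small sketch of that group-conditional dispatch, with host and group names taken from this testbed but the function itself purely illustrative:

```python
def task_status(host: str, group_members: set) -> str:
    """Group-scoped task: members apply the item, every other host skips it."""
    return "changed" if host in group_members else "skipping"


# In this testbed, nodes 3-5 act as the compute group for the conntrack item.
compute = {"testbed-node-3", "testbed-node-4", "testbed-node-5"}
for host in ("testbed-manager", "testbed-node-0", "testbed-node-3"):
    print(host, task_status(host, compute))
```

This is why each item appears up to twice per host in the log: once per loop item and once for the host-level skip summary.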
Tuesday 07 April 2026 00:29:59 +0000 (0:00:00.999) 0:04:05.257 *********
2026-04-07 00:31:11.569757 | orchestrator | ok: [testbed-manager]
2026-04-07 00:31:11.569767 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:31:11.569778 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:31:11.569789 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:31:11.569799 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:31:11.569810 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:31:11.569821 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:31:11.569832 | orchestrator |
2026-04-07 00:31:11.569843 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_packages_distribution variable to default value] ***
2026-04-07 00:31:11.569882 | orchestrator | Tuesday 07 April 2026 00:30:00 +0000 (0:00:00.257) 0:04:05.538 *********
2026-04-07 00:31:11.569894 | orchestrator | ok: [testbed-manager]
2026-04-07 00:31:11.569905 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:31:11.569916 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:31:11.569926 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:31:11.569937 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:31:11.569947 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:31:11.569958 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:31:11.569969 | orchestrator |
2026-04-07 00:31:11.570008 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_services_distribution variable to default value] ***
2026-04-07 00:31:11.570142 | orchestrator | Tuesday 07 April 2026 00:30:00 +0000 (0:00:00.297) 0:04:05.796 *********
2026-04-07 00:31:11.570158 | orchestrator | ok: [testbed-manager]
2026-04-07 00:31:11.570168 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:31:11.570179 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:31:11.570190 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:31:11.570200 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:31:11.570211 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:31:11.570222 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:31:11.570232 | orchestrator |
2026-04-07 00:31:11.570243 | orchestrator | TASK [osism.commons.cleanup : Populate service facts] **************************
2026-04-07 00:31:11.570321 | orchestrator | Tuesday 07 April 2026 00:30:00 +0000 (0:00:05.656) 0:04:06.093 *********
2026-04-07 00:31:11.570341 | orchestrator | ok: [testbed-manager]
2026-04-07 00:31:11.570358 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:31:11.570375 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:31:11.570393 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:31:11.570410 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:31:11.570428 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:31:11.570446 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:31:11.570465 | orchestrator |
2026-04-07 00:31:11.570482 | orchestrator | TASK [osism.commons.cleanup : Include distribution specific timer tasks] *******
2026-04-07 00:31:11.570500 | orchestrator | Tuesday 07 April 2026 00:30:06 +0000 (0:00:05.656) 0:04:11.750 *********
2026-04-07 00:31:11.570521 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/timers-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:31:11.570541 | orchestrator |
2026-04-07 00:31:11.570577 | orchestrator | TASK [osism.commons.cleanup : Disable apt-daily timers] ************************
2026-04-07 00:31:11.570595 | orchestrator | Tuesday 07 April 2026 00:30:06 +0000 (0:00:00.362) 0:04:12.112 *********
2026-04-07 00:31:11.570612 | orchestrator | skipping: [testbed-manager] => (item=apt-daily-upgrade)
2026-04-07 00:31:11.570629 | orchestrator | skipping: [testbed-manager] => (item=apt-daily)
2026-04-07 00:31:11.570645 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:31:11.570663 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily-upgrade)
2026-04-07 00:31:11.570680 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily)
2026-04-07 00:31:11.570697 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily-upgrade)
2026-04-07 00:31:11.570715 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:31:11.570734 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily)
2026-04-07 00:31:11.570751 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily-upgrade)
2026-04-07 00:31:11.570769 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily)
2026-04-07 00:31:11.570788 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:31:11.570807 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily-upgrade)
2026-04-07 00:31:11.570825 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:31:11.570843 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily)
2026-04-07 00:31:11.570862 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:31:11.570879 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily-upgrade)
2026-04-07 00:31:11.570929 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily)
2026-04-07 00:31:11.570970 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:31:11.571034 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily-upgrade)
2026-04-07 00:31:11.571052 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily)
2026-04-07 00:31:11.571063 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:31:11.571074 | orchestrator |
2026-04-07 00:31:11.571085 | orchestrator | TASK [osism.commons.cleanup : Include service tasks] ***************************
2026-04-07 00:31:11.571096 | orchestrator | Tuesday 07 April 2026 00:30:06 +0000 (0:00:00.309) 0:04:12.421 *********
2026-04-07 00:31:11.571107 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/services-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:31:11.571119 | orchestrator |
2026-04-07 00:31:11.571130 | orchestrator | TASK [osism.commons.cleanup : Cleanup services] ********************************
2026-04-07 00:31:11.571141 | orchestrator | Tuesday 07 April 2026 00:30:07 +0000 (0:00:00.463) 0:04:12.885 *********
2026-04-07 00:31:11.571151 | orchestrator | skipping: [testbed-manager] => (item=ModemManager.service)
2026-04-07 00:31:11.571162 | orchestrator | skipping: [testbed-node-0] => (item=ModemManager.service)
2026-04-07 00:31:11.571173 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:31:11.571184 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:31:11.571194 | orchestrator | skipping: [testbed-node-1] => (item=ModemManager.service)
2026-04-07 00:31:11.571205 | orchestrator | skipping: [testbed-node-2] => (item=ModemManager.service)
2026-04-07 00:31:11.571216 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:31:11.571227 | orchestrator | skipping: [testbed-node-3] => (item=ModemManager.service)
2026-04-07 00:31:11.571237 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:31:11.571248 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:31:11.571259 | orchestrator | skipping: [testbed-node-4] => (item=ModemManager.service)
2026-04-07 00:31:11.571270 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:31:11.571281 | orchestrator | skipping: [testbed-node-5] => (item=ModemManager.service)
2026-04-07 00:31:11.571292 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:31:11.571302 | orchestrator |
2026-04-07 00:31:11.571313 | orchestrator | TASK [osism.commons.cleanup : Include packages tasks] **************************
2026-04-07 00:31:11.571324 | orchestrator | Tuesday 07 April 2026 00:30:07 +0000 (0:00:00.286) 0:04:13.171 *********
2026-04-07 00:31:11.571335 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/packages-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:31:11.571346 | orchestrator |
2026-04-07 00:31:11.571357 | orchestrator | TASK [osism.commons.cleanup : Cleanup installed packages] **********************
2026-04-07 00:31:11.571368 | orchestrator | Tuesday 07 April 2026 00:30:08 +0000 (0:00:00.377) 0:04:13.549 *********
2026-04-07 00:31:11.571378 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:31:11.571389 | orchestrator | changed: [testbed-manager]
2026-04-07 00:31:11.571400 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:31:11.571411 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:31:11.571421 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:31:11.571432 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:31:11.571443 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:31:11.571454 | orchestrator |
2026-04-07 00:31:11.571465 | orchestrator | TASK [osism.commons.cleanup : Remove cloudinit package] ************************
2026-04-07 00:31:11.571475 | orchestrator | Tuesday 07 April 2026 00:30:42 +0000 (0:00:34.544) 0:04:48.093 *********
2026-04-07 00:31:11.571486 | orchestrator | changed: [testbed-manager]
2026-04-07 00:31:11.571497 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:31:11.571507 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:31:11.571518 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:31:11.571529 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:31:11.571548 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:31:11.571559 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:31:11.571570 | orchestrator |
2026-04-07 00:31:11.571580 | orchestrator | TASK [osism.commons.cleanup : Uninstall unattended-upgrades package] ***********
2026-04-07 00:31:11.571591 | orchestrator |
Tuesday 07 April 2026 00:30:52 +0000 (0:00:10.068) 0:04:58.162 ********* 2026-04-07 00:31:11.571602 | orchestrator | changed: [testbed-manager] 2026-04-07 00:31:11.571613 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:31:11.571632 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:31:11.571643 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:31:11.571654 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:31:11.571665 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:31:11.571675 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:31:11.571686 | orchestrator | 2026-04-07 00:31:11.571697 | orchestrator | TASK [osism.commons.cleanup : Remove useless packages from the cache] ********** 2026-04-07 00:31:11.571708 | orchestrator | Tuesday 07 April 2026 00:31:02 +0000 (0:00:09.387) 0:05:07.549 ********* 2026-04-07 00:31:11.571718 | orchestrator | ok: [testbed-manager] 2026-04-07 00:31:11.571729 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:31:11.571740 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:31:11.571751 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:31:11.571761 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:31:11.571772 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:31:11.571783 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:31:11.571793 | orchestrator | 2026-04-07 00:31:11.571804 | orchestrator | TASK [osism.commons.cleanup : Remove dependencies that are no longer required] *** 2026-04-07 00:31:11.571816 | orchestrator | Tuesday 07 April 2026 00:31:03 +0000 (0:00:01.918) 0:05:09.468 ********* 2026-04-07 00:31:11.571827 | orchestrator | changed: [testbed-manager] 2026-04-07 00:31:11.571838 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:31:11.571848 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:31:11.571859 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:31:11.571870 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:31:11.571881 | orchestrator | changed: 
[testbed-node-0] 2026-04-07 00:31:11.571891 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:31:11.571902 | orchestrator | 2026-04-07 00:31:11.571921 | orchestrator | TASK [osism.commons.cleanup : Include cloudinit tasks] ************************* 2026-04-07 00:31:23.532906 | orchestrator | Tuesday 07 April 2026 00:31:11 +0000 (0:00:07.624) 0:05:17.093 ********* 2026-04-07 00:31:23.533104 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/cloudinit.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:31:23.533126 | orchestrator | 2026-04-07 00:31:23.533140 | orchestrator | TASK [osism.commons.cleanup : Remove cloud-init configuration directory] ******* 2026-04-07 00:31:23.533151 | orchestrator | Tuesday 07 April 2026 00:31:11 +0000 (0:00:00.397) 0:05:17.490 ********* 2026-04-07 00:31:23.533163 | orchestrator | changed: [testbed-manager] 2026-04-07 00:31:23.533175 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:31:23.533186 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:31:23.533197 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:31:23.533207 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:31:23.533218 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:31:23.533229 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:31:23.533239 | orchestrator | 2026-04-07 00:31:23.533251 | orchestrator | TASK [osism.commons.timezone : Install tzdata package] ************************* 2026-04-07 00:31:23.533262 | orchestrator | Tuesday 07 April 2026 00:31:12 +0000 (0:00:00.802) 0:05:18.292 ********* 2026-04-07 00:31:23.533272 | orchestrator | ok: [testbed-manager] 2026-04-07 00:31:23.533284 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:31:23.533295 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:31:23.533306 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:31:23.533317 | 
orchestrator | ok: [testbed-node-0] 2026-04-07 00:31:23.533328 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:31:23.533365 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:31:23.533377 | orchestrator | 2026-04-07 00:31:23.533387 | orchestrator | TASK [osism.commons.timezone : Set timezone to UTC] **************************** 2026-04-07 00:31:23.533399 | orchestrator | Tuesday 07 April 2026 00:31:14 +0000 (0:00:02.228) 0:05:20.521 ********* 2026-04-07 00:31:23.533411 | orchestrator | changed: [testbed-manager] 2026-04-07 00:31:23.533422 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:31:23.533433 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:31:23.533446 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:31:23.533459 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:31:23.533473 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:31:23.533484 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:31:23.533497 | orchestrator | 2026-04-07 00:31:23.533514 | orchestrator | TASK [osism.commons.timezone : Create /etc/adjtime file] *********************** 2026-04-07 00:31:23.533534 | orchestrator | Tuesday 07 April 2026 00:31:15 +0000 (0:00:00.981) 0:05:21.503 ********* 2026-04-07 00:31:23.533553 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:31:23.533573 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:31:23.533590 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:31:23.533608 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:31:23.533627 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:31:23.533645 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:31:23.533663 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:31:23.533683 | orchestrator | 2026-04-07 00:31:23.533703 | orchestrator | TASK [osism.commons.timezone : Ensure UTC in /etc/adjtime] ********************* 2026-04-07 00:31:23.533723 | orchestrator | Tuesday 07 April 2026 00:31:16 +0000 (0:00:00.260) 
0:05:21.763 ********* 2026-04-07 00:31:23.533742 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:31:23.533760 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:31:23.533778 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:31:23.533796 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:31:23.533814 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:31:23.533832 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:31:23.533846 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:31:23.533856 | orchestrator | 2026-04-07 00:31:23.533867 | orchestrator | TASK [osism.services.docker : Gather variables for each operating system] ****** 2026-04-07 00:31:23.533878 | orchestrator | Tuesday 07 April 2026 00:31:16 +0000 (0:00:00.375) 0:05:22.139 ********* 2026-04-07 00:31:23.533889 | orchestrator | ok: [testbed-manager] 2026-04-07 00:31:23.533900 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:31:23.533911 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:31:23.533921 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:31:23.533932 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:31:23.533942 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:31:23.533953 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:31:23.533963 | orchestrator | 2026-04-07 00:31:23.533974 | orchestrator | TASK [osism.services.docker : Set docker_version variable to default value] **** 2026-04-07 00:31:23.534103 | orchestrator | Tuesday 07 April 2026 00:31:16 +0000 (0:00:00.369) 0:05:22.508 ********* 2026-04-07 00:31:23.534120 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:31:23.534131 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:31:23.534142 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:31:23.534153 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:31:23.534163 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:31:23.534174 | orchestrator | skipping: [testbed-node-4] 2026-04-07 
00:31:23.534185 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:31:23.534196 | orchestrator | 2026-04-07 00:31:23.534207 | orchestrator | TASK [osism.services.docker : Set docker_cli_version variable to default value] *** 2026-04-07 00:31:23.534219 | orchestrator | Tuesday 07 April 2026 00:31:17 +0000 (0:00:00.246) 0:05:22.755 ********* 2026-04-07 00:31:23.534230 | orchestrator | ok: [testbed-manager] 2026-04-07 00:31:23.534241 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:31:23.534252 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:31:23.534263 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:31:23.534287 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:31:23.534298 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:31:23.534309 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:31:23.534319 | orchestrator | 2026-04-07 00:31:23.534331 | orchestrator | TASK [osism.services.docker : Print used docker version] *********************** 2026-04-07 00:31:23.534351 | orchestrator | Tuesday 07 April 2026 00:31:17 +0000 (0:00:00.288) 0:05:23.044 ********* 2026-04-07 00:31:23.534369 | orchestrator | ok: [testbed-manager] =>  2026-04-07 00:31:23.534387 | orchestrator |  docker_version: 5:27.5.1 2026-04-07 00:31:23.534405 | orchestrator | ok: [testbed-node-0] =>  2026-04-07 00:31:23.534422 | orchestrator |  docker_version: 5:27.5.1 2026-04-07 00:31:23.534440 | orchestrator | ok: [testbed-node-1] =>  2026-04-07 00:31:23.534459 | orchestrator |  docker_version: 5:27.5.1 2026-04-07 00:31:23.534477 | orchestrator | ok: [testbed-node-2] =>  2026-04-07 00:31:23.534496 | orchestrator |  docker_version: 5:27.5.1 2026-04-07 00:31:23.534531 | orchestrator | ok: [testbed-node-3] =>  2026-04-07 00:31:23.534543 | orchestrator |  docker_version: 5:27.5.1 2026-04-07 00:31:23.534554 | orchestrator | ok: [testbed-node-4] =>  2026-04-07 00:31:23.534564 | orchestrator |  docker_version: 5:27.5.1 2026-04-07 00:31:23.534575 | orchestrator | ok: [testbed-node-5] =>  
2026-04-07 00:31:23.534586 | orchestrator |  docker_version: 5:27.5.1 2026-04-07 00:31:23.534597 | orchestrator | 2026-04-07 00:31:23.534607 | orchestrator | TASK [osism.services.docker : Print used docker cli version] ******************* 2026-04-07 00:31:23.534618 | orchestrator | Tuesday 07 April 2026 00:31:17 +0000 (0:00:00.253) 0:05:23.298 ********* 2026-04-07 00:31:23.534629 | orchestrator | ok: [testbed-manager] =>  2026-04-07 00:31:23.534640 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-07 00:31:23.534651 | orchestrator | ok: [testbed-node-0] =>  2026-04-07 00:31:23.534661 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-07 00:31:23.534672 | orchestrator | ok: [testbed-node-1] =>  2026-04-07 00:31:23.534682 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-07 00:31:23.534693 | orchestrator | ok: [testbed-node-2] =>  2026-04-07 00:31:23.534705 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-07 00:31:23.534723 | orchestrator | ok: [testbed-node-3] =>  2026-04-07 00:31:23.534742 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-07 00:31:23.534759 | orchestrator | ok: [testbed-node-4] =>  2026-04-07 00:31:23.534777 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-07 00:31:23.534793 | orchestrator | ok: [testbed-node-5] =>  2026-04-07 00:31:23.534810 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-07 00:31:23.534827 | orchestrator | 2026-04-07 00:31:23.534845 | orchestrator | TASK [osism.services.docker : Include block storage tasks] ********************* 2026-04-07 00:31:23.534862 | orchestrator | Tuesday 07 April 2026 00:31:18 +0000 (0:00:00.248) 0:05:23.547 ********* 2026-04-07 00:31:23.534879 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:31:23.534895 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:31:23.534913 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:31:23.534931 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:31:23.534949 | orchestrator | skipping: [testbed-node-3] 
2026-04-07 00:31:23.534969 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:31:23.535017 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:31:23.535031 | orchestrator | 2026-04-07 00:31:23.535042 | orchestrator | TASK [osism.services.docker : Include zram storage tasks] ********************** 2026-04-07 00:31:23.535053 | orchestrator | Tuesday 07 April 2026 00:31:18 +0000 (0:00:00.269) 0:05:23.817 ********* 2026-04-07 00:31:23.535063 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:31:23.535074 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:31:23.535084 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:31:23.535095 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:31:23.535106 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:31:23.535117 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:31:23.535127 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:31:23.535138 | orchestrator | 2026-04-07 00:31:23.535149 | orchestrator | TASK [osism.services.docker : Include docker install tasks] ******************** 2026-04-07 00:31:23.535171 | orchestrator | Tuesday 07 April 2026 00:31:18 +0000 (0:00:00.243) 0:05:24.060 ********* 2026-04-07 00:31:23.535185 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/install-docker-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:31:23.535199 | orchestrator | 2026-04-07 00:31:23.535210 | orchestrator | TASK [osism.services.docker : Remove old architecture-dependent repository] **** 2026-04-07 00:31:23.535220 | orchestrator | Tuesday 07 April 2026 00:31:18 +0000 (0:00:00.418) 0:05:24.478 ********* 2026-04-07 00:31:23.535231 | orchestrator | ok: [testbed-manager] 2026-04-07 00:31:23.535242 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:31:23.535253 | orchestrator | ok: [testbed-node-4] 2026-04-07 
00:31:23.535264 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:31:23.535274 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:31:23.535285 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:31:23.535296 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:31:23.535306 | orchestrator | 2026-04-07 00:31:23.535370 | orchestrator | TASK [osism.services.docker : Gather package facts] **************************** 2026-04-07 00:31:23.535389 | orchestrator | Tuesday 07 April 2026 00:31:19 +0000 (0:00:00.898) 0:05:25.377 ********* 2026-04-07 00:31:23.535405 | orchestrator | ok: [testbed-manager] 2026-04-07 00:31:23.535423 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:31:23.535442 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:31:23.535460 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:31:23.535477 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:31:23.535489 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:31:23.535499 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:31:23.535510 | orchestrator | 2026-04-07 00:31:23.535527 | orchestrator | TASK [osism.services.docker : Check whether packages are installed that should not be installed] *** 2026-04-07 00:31:23.535540 | orchestrator | Tuesday 07 April 2026 00:31:23 +0000 (0:00:03.343) 0:05:28.721 ********* 2026-04-07 00:31:23.535550 | orchestrator | skipping: [testbed-manager] => (item=containerd)  2026-04-07 00:31:23.535562 | orchestrator | skipping: [testbed-manager] => (item=docker.io)  2026-04-07 00:31:23.535573 | orchestrator | skipping: [testbed-manager] => (item=docker-engine)  2026-04-07 00:31:23.535584 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:31:23.535625 | orchestrator | skipping: [testbed-node-0] => (item=containerd)  2026-04-07 00:31:23.535637 | orchestrator | skipping: [testbed-node-0] => (item=docker.io)  2026-04-07 00:31:23.535648 | orchestrator | skipping: [testbed-node-0] => (item=docker-engine)  2026-04-07 00:31:23.535659 | orchestrator | skipping: 
[testbed-node-0] 2026-04-07 00:31:23.535669 | orchestrator | skipping: [testbed-node-1] => (item=containerd)  2026-04-07 00:31:23.535680 | orchestrator | skipping: [testbed-node-1] => (item=docker.io)  2026-04-07 00:31:23.535691 | orchestrator | skipping: [testbed-node-1] => (item=docker-engine)  2026-04-07 00:31:23.535701 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:31:23.535713 | orchestrator | skipping: [testbed-node-2] => (item=containerd)  2026-04-07 00:31:23.535723 | orchestrator | skipping: [testbed-node-2] => (item=docker.io)  2026-04-07 00:31:23.535734 | orchestrator | skipping: [testbed-node-2] => (item=docker-engine)  2026-04-07 00:31:23.535745 | orchestrator | skipping: [testbed-node-3] => (item=containerd)  2026-04-07 00:31:23.535769 | orchestrator | skipping: [testbed-node-3] => (item=docker.io)  2026-04-07 00:32:30.902394 | orchestrator | skipping: [testbed-node-3] => (item=docker-engine)  2026-04-07 00:32:30.902540 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:32:30.902568 | orchestrator | skipping: [testbed-node-4] => (item=containerd)  2026-04-07 00:32:30.902627 | orchestrator | skipping: [testbed-node-4] => (item=docker.io)  2026-04-07 00:32:30.902648 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:32:30.902667 | orchestrator | skipping: [testbed-node-4] => (item=docker-engine)  2026-04-07 00:32:30.902686 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:32:30.902738 | orchestrator | skipping: [testbed-node-5] => (item=containerd)  2026-04-07 00:32:30.902759 | orchestrator | skipping: [testbed-node-5] => (item=docker.io)  2026-04-07 00:32:30.902778 | orchestrator | skipping: [testbed-node-5] => (item=docker-engine)  2026-04-07 00:32:30.902796 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:32:30.902816 | orchestrator | 2026-04-07 00:32:30.902836 | orchestrator | TASK [osism.services.docker : Install apt-transport-https package] ************* 2026-04-07 00:32:30.902858 | orchestrator | Tuesday 
07 April 2026 00:31:23 +0000 (0:00:00.570) 0:05:29.291 ********* 2026-04-07 00:32:30.902876 | orchestrator | ok: [testbed-manager] 2026-04-07 00:32:30.902895 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.902914 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.902934 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.902953 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.902999 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.903020 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:32:30.903039 | orchestrator | 2026-04-07 00:32:30.903058 | orchestrator | TASK [osism.services.docker : Add repository gpg key] ************************** 2026-04-07 00:32:30.903076 | orchestrator | Tuesday 07 April 2026 00:31:31 +0000 (0:00:07.818) 0:05:37.110 ********* 2026-04-07 00:32:30.903094 | orchestrator | ok: [testbed-manager] 2026-04-07 00:32:30.903113 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.903132 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.903151 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.903170 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:32:30.903189 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.903208 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.903226 | orchestrator | 2026-04-07 00:32:30.903246 | orchestrator | TASK [osism.services.docker : Add repository] ********************************** 2026-04-07 00:32:30.903264 | orchestrator | Tuesday 07 April 2026 00:31:32 +0000 (0:00:01.084) 0:05:38.194 ********* 2026-04-07 00:32:30.903283 | orchestrator | ok: [testbed-manager] 2026-04-07 00:32:30.903302 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.903320 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.903339 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.903358 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.903377 | orchestrator | 
changed: [testbed-node-0] 2026-04-07 00:32:30.903397 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.903416 | orchestrator | 2026-04-07 00:32:30.903435 | orchestrator | TASK [osism.services.docker : Update package cache] **************************** 2026-04-07 00:32:30.903454 | orchestrator | Tuesday 07 April 2026 00:31:42 +0000 (0:00:10.222) 0:05:48.417 ********* 2026-04-07 00:32:30.903473 | orchestrator | changed: [testbed-manager] 2026-04-07 00:32:30.903491 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.903510 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.903529 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.903547 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:32:30.903565 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.903584 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.903603 | orchestrator | 2026-04-07 00:32:30.903622 | orchestrator | TASK [osism.services.docker : Pin docker package version] ********************** 2026-04-07 00:32:30.903641 | orchestrator | Tuesday 07 April 2026 00:31:46 +0000 (0:00:03.485) 0:05:51.903 ********* 2026-04-07 00:32:30.903659 | orchestrator | ok: [testbed-manager] 2026-04-07 00:32:30.903677 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:32:30.903695 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.903714 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.903733 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.903752 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.903770 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.903788 | orchestrator | 2026-04-07 00:32:30.903806 | orchestrator | TASK [osism.services.docker : Pin docker-cli package version] ****************** 2026-04-07 00:32:30.903826 | orchestrator | Tuesday 07 April 2026 00:31:47 +0000 (0:00:01.267) 0:05:53.170 ********* 2026-04-07 00:32:30.903861 | orchestrator | ok: [testbed-manager] 
2026-04-07 00:32:30.903879 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.903896 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:32:30.903915 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.903935 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.903954 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.904005 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.904028 | orchestrator | 2026-04-07 00:32:30.904047 | orchestrator | TASK [osism.services.docker : Unlock containerd package] *********************** 2026-04-07 00:32:30.904066 | orchestrator | Tuesday 07 April 2026 00:31:48 +0000 (0:00:01.318) 0:05:54.488 ********* 2026-04-07 00:32:30.904085 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:32:30.904103 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:32:30.904121 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:32:30.904140 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:32:30.904158 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:32:30.904177 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:32:30.904195 | orchestrator | changed: [testbed-manager] 2026-04-07 00:32:30.904213 | orchestrator | 2026-04-07 00:32:30.904231 | orchestrator | TASK [osism.services.docker : Install containerd package] ********************** 2026-04-07 00:32:30.904250 | orchestrator | Tuesday 07 April 2026 00:31:49 +0000 (0:00:00.599) 0:05:55.087 ********* 2026-04-07 00:32:30.904269 | orchestrator | ok: [testbed-manager] 2026-04-07 00:32:30.904287 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.904307 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.904326 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.904344 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.904363 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.904382 | orchestrator | changed: [testbed-node-0] 2026-04-07 
00:32:30.904418 | orchestrator | 2026-04-07 00:32:30.904450 | orchestrator | TASK [osism.services.docker : Lock containerd package] ************************* 2026-04-07 00:32:30.904494 | orchestrator | Tuesday 07 April 2026 00:32:01 +0000 (0:00:11.807) 0:06:06.894 ********* 2026-04-07 00:32:30.904515 | orchestrator | changed: [testbed-manager] 2026-04-07 00:32:30.904534 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:32:30.904551 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.904571 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.904590 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.904610 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.904628 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.904646 | orchestrator | 2026-04-07 00:32:30.904665 | orchestrator | TASK [osism.services.docker : Install docker-cli package] ********************** 2026-04-07 00:32:30.904684 | orchestrator | Tuesday 07 April 2026 00:32:02 +0000 (0:00:01.119) 0:06:08.014 ********* 2026-04-07 00:32:30.904703 | orchestrator | ok: [testbed-manager] 2026-04-07 00:32:30.904722 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:32:30.904739 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.904757 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:32:30.904776 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:32:30.904795 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:32:30.904814 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:32:30.904833 | orchestrator | 2026-04-07 00:32:30.904851 | orchestrator | TASK [osism.services.docker : Install docker package] ************************** 2026-04-07 00:32:30.904870 | orchestrator | Tuesday 07 April 2026 00:32:12 +0000 (0:00:10.003) 0:06:18.018 ********* 2026-04-07 00:32:30.904889 | orchestrator | ok: [testbed-manager] 2026-04-07 00:32:30.904908 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:32:30.904927 | 
orchestrator | changed: [testbed-node-1]
2026-04-07 00:32:30.904945 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:32:30.904963 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:32:30.905047 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:32:30.905068 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:32:30.905085 | orchestrator |
2026-04-07 00:32:30.905104 | orchestrator | TASK [osism.services.docker : Unblock installation of python docker packages] ***
2026-04-07 00:32:30.905137 | orchestrator | Tuesday 07 April 2026 00:32:24 +0000 (0:00:11.836) 0:06:29.854 *********
2026-04-07 00:32:30.905155 | orchestrator | ok: [testbed-manager] => (item=python3-docker)
2026-04-07 00:32:30.905172 | orchestrator | ok: [testbed-node-0] => (item=python3-docker)
2026-04-07 00:32:30.905187 | orchestrator | ok: [testbed-node-1] => (item=python3-docker)
2026-04-07 00:32:30.905203 | orchestrator | ok: [testbed-node-2] => (item=python3-docker)
2026-04-07 00:32:30.905220 | orchestrator | ok: [testbed-node-3] => (item=python3-docker)
2026-04-07 00:32:30.905237 | orchestrator | ok: [testbed-manager] => (item=python-docker)
2026-04-07 00:32:30.905253 | orchestrator | ok: [testbed-node-4] => (item=python3-docker)
2026-04-07 00:32:30.905270 | orchestrator | ok: [testbed-node-5] => (item=python3-docker)
2026-04-07 00:32:30.905285 | orchestrator | ok: [testbed-node-0] => (item=python-docker)
2026-04-07 00:32:30.905301 | orchestrator | ok: [testbed-node-1] => (item=python-docker)
2026-04-07 00:32:30.905318 | orchestrator | ok: [testbed-node-2] => (item=python-docker)
2026-04-07 00:32:30.905334 | orchestrator | ok: [testbed-node-3] => (item=python-docker)
2026-04-07 00:32:30.905351 | orchestrator | ok: [testbed-node-4] => (item=python-docker)
2026-04-07 00:32:30.905366 | orchestrator | ok: [testbed-node-5] => (item=python-docker)
2026-04-07 00:32:30.905383 | orchestrator |
2026-04-07 00:32:30.905400 | orchestrator | TASK [osism.services.docker : Install python3 docker package] ******************
2026-04-07 00:32:30.905417 | orchestrator | Tuesday 07 April 2026 00:32:25 +0000 (0:00:01.212) 0:06:31.067 *********
2026-04-07 00:32:30.905433 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:32:30.905449 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:32:30.905465 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:32:30.905481 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:32:30.905498 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:32:30.905514 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:32:30.905531 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:32:30.905546 | orchestrator |
2026-04-07 00:32:30.905562 | orchestrator | TASK [osism.services.docker : Install python3 docker package from Debian Sid] ***
2026-04-07 00:32:30.905579 | orchestrator | Tuesday 07 April 2026 00:32:26 +0000 (0:00:00.641) 0:06:31.708 *********
2026-04-07 00:32:30.905595 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:30.905611 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:32:30.905628 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:32:30.905644 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:32:30.905660 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:32:30.905675 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:32:30.905692 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:32:30.905708 | orchestrator |
2026-04-07 00:32:30.905726 | orchestrator | TASK [osism.services.docker : Remove python docker packages (install python bindings from pip)] ***
2026-04-07 00:32:30.905761 | orchestrator | Tuesday 07 April 2026 00:32:30 +0000 (0:00:03.981) 0:06:35.690 *********
2026-04-07 00:32:30.905777 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:32:30.905793 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:32:30.905809 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:32:30.905826 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:32:30.905842 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:32:30.905858 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:32:30.905874 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:32:30.905890 | orchestrator |
2026-04-07 00:32:30.905907 | orchestrator | TASK [osism.services.docker : Block installation of python docker packages (install python bindings from pip)] ***
2026-04-07 00:32:30.905922 | orchestrator | Tuesday 07 April 2026 00:32:30 +0000 (0:00:00.494) 0:06:36.184 *********
2026-04-07 00:32:30.905939 | orchestrator | skipping: [testbed-manager] => (item=python3-docker)
2026-04-07 00:32:30.905955 | orchestrator | skipping: [testbed-manager] => (item=python-docker)
2026-04-07 00:32:30.905994 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:32:30.906088 | orchestrator | skipping: [testbed-node-0] => (item=python3-docker)
2026-04-07 00:32:30.906110 | orchestrator | skipping: [testbed-node-0] => (item=python-docker)
2026-04-07 00:32:30.906127 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:32:30.906142 | orchestrator | skipping: [testbed-node-1] => (item=python3-docker)
2026-04-07 00:32:30.906159 | orchestrator | skipping: [testbed-node-1] => (item=python-docker)
2026-04-07 00:32:30.906177 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:32:30.906206 | orchestrator | skipping: [testbed-node-2] => (item=python3-docker)
2026-04-07 00:32:51.200213 | orchestrator | skipping: [testbed-node-2] => (item=python-docker)
2026-04-07 00:32:51.200311 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:32:51.200323 | orchestrator | skipping: [testbed-node-3] => (item=python3-docker)
2026-04-07 00:32:51.200331 | orchestrator | skipping: [testbed-node-3] => (item=python-docker)
2026-04-07 00:32:51.200340 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:32:51.200348 | orchestrator | skipping: [testbed-node-4] => (item=python3-docker)
2026-04-07 00:32:51.200356 | orchestrator | skipping: [testbed-node-4] => (item=python-docker)
2026-04-07 00:32:51.200364 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:32:51.200372 | orchestrator | skipping: [testbed-node-5] => (item=python3-docker)
2026-04-07 00:32:51.200380 | orchestrator | skipping: [testbed-node-5] => (item=python-docker)
2026-04-07 00:32:51.200388 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:32:51.200396 | orchestrator |
2026-04-07 00:32:51.200405 | orchestrator | TASK [osism.services.docker : Install python3-pip package (install python bindings from pip)] ***
2026-04-07 00:32:51.200414 | orchestrator | Tuesday 07 April 2026 00:32:31 +0000 (0:00:00.540) 0:06:36.725 *********
2026-04-07 00:32:51.200423 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:32:51.200431 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:32:51.200438 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:32:51.200446 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:32:51.200454 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:32:51.200462 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:32:51.200470 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:32:51.200477 | orchestrator |
2026-04-07 00:32:51.200486 | orchestrator | TASK [osism.services.docker : Install docker packages (install python bindings from pip)] ***
2026-04-07 00:32:51.200494 | orchestrator | Tuesday 07 April 2026 00:32:31 +0000 (0:00:00.459) 0:06:37.185 *********
2026-04-07 00:32:51.200502 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:32:51.200509 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:32:51.200517 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:32:51.200525 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:32:51.200533 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:32:51.200541 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:32:51.200549 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:32:51.200556 | orchestrator |
2026-04-07 00:32:51.200564 | orchestrator | TASK [osism.services.docker : Install packages required by docker login] *******
2026-04-07 00:32:51.200572 | orchestrator | Tuesday 07 April 2026 00:32:32 +0000 (0:00:00.600) 0:06:37.785 *********
2026-04-07 00:32:51.200580 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:32:51.200588 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:32:51.200596 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:32:51.200604 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:32:51.200611 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:32:51.200619 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:32:51.200627 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:32:51.200635 | orchestrator |
2026-04-07 00:32:51.200643 | orchestrator | TASK [osism.services.docker : Ensure that some packages are not installed] *****
2026-04-07 00:32:51.200651 | orchestrator | Tuesday 07 April 2026 00:32:32 +0000 (0:00:02.072) 0:06:38.290 *********
2026-04-07 00:32:51.200658 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.200667 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:32:51.200699 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:32:51.200708 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:32:51.200715 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:32:51.200723 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:32:51.200731 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:32:51.200739 | orchestrator |
2026-04-07 00:32:51.200747 | orchestrator | TASK [osism.services.docker : Include config tasks] ****************************
2026-04-07 00:32:51.200755 | orchestrator | Tuesday 07 April 2026 00:32:34 +0000 (0:00:02.072) 0:06:40.362 *********
2026-04-07 00:32:51.200764 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:32:51.200775 | orchestrator |
2026-04-07 00:32:51.200785 | orchestrator | TASK [osism.services.docker : Create plugins directory] ************************
2026-04-07 00:32:51.200794 | orchestrator | Tuesday 07 April 2026 00:32:35 +0000 (0:00:00.857) 0:06:41.220 *********
2026-04-07 00:32:51.200803 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.200812 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:32:51.200821 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:32:51.200843 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:32:51.200852 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:32:51.200861 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:32:51.200870 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:32:51.200879 | orchestrator |
2026-04-07 00:32:51.200888 | orchestrator | TASK [osism.services.docker : Create systemd overlay directory] ****************
2026-04-07 00:32:51.200898 | orchestrator | Tuesday 07 April 2026 00:32:36 +0000 (0:00:01.164) 0:06:42.384 *********
2026-04-07 00:32:51.200908 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.200917 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:32:51.200927 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:32:51.200936 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:32:51.200945 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:32:51.200978 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:32:51.200987 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:32:51.200996 | orchestrator |
2026-04-07 00:32:51.201005 | orchestrator | TASK [osism.services.docker : Copy systemd overlay file] ***********************
2026-04-07 00:32:51.201013 | orchestrator | Tuesday 07 April 2026 00:32:37 +0000 (0:00:00.900) 0:06:43.285 *********
2026-04-07 00:32:51.201023 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.201032 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:32:51.201040 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:32:51.201049 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:32:51.201058 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:32:51.201067 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:32:51.201076 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:32:51.201085 | orchestrator |
2026-04-07 00:32:51.201094 | orchestrator | TASK [osism.services.docker : Reload systemd daemon if systemd overlay file is changed] ***
2026-04-07 00:32:51.201117 | orchestrator | Tuesday 07 April 2026 00:32:39 +0000 (0:00:01.416) 0:06:44.702 *********
2026-04-07 00:32:51.201126 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:32:51.201136 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:32:51.201145 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:32:51.201154 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:32:51.201164 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:32:51.201173 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:32:51.201181 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:32:51.201189 | orchestrator |
2026-04-07 00:32:51.201197 | orchestrator | TASK [osism.services.docker : Copy limits configuration file] ******************
2026-04-07 00:32:51.201205 | orchestrator | Tuesday 07 April 2026 00:32:40 +0000 (0:00:01.677) 0:06:46.379 *********
2026-04-07 00:32:51.201213 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.201221 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:32:51.201228 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:32:51.201236 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:32:51.201251 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:32:51.201259 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:32:51.201267 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:32:51.201275 | orchestrator |
2026-04-07 00:32:51.201282 | orchestrator | TASK [osism.services.docker : Copy daemon.json configuration file] *************
2026-04-07 00:32:51.201290 | orchestrator | Tuesday 07 April 2026 00:32:42 +0000 (0:00:01.447) 0:06:47.827 *********
2026-04-07 00:32:51.201298 | orchestrator | changed: [testbed-manager]
2026-04-07 00:32:51.201306 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:32:51.201314 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:32:51.201322 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:32:51.201329 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:32:51.201337 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:32:51.201345 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:32:51.201353 | orchestrator |
2026-04-07 00:32:51.201361 | orchestrator | TASK [osism.services.docker : Include service tasks] ***************************
2026-04-07 00:32:51.201369 | orchestrator | Tuesday 07 April 2026 00:32:43 +0000 (0:00:01.594) 0:06:49.422 *********
2026-04-07 00:32:51.201377 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/service.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:32:51.201385 | orchestrator |
2026-04-07 00:32:51.201393 | orchestrator | TASK [osism.services.docker : Reload systemd daemon] ***************************
2026-04-07 00:32:51.201401 | orchestrator | Tuesday 07 April 2026 00:32:44 +0000 (0:00:00.820) 0:06:50.243 *********
2026-04-07 00:32:51.201409 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.201416 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:32:51.201424 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:32:51.201432 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:32:51.201440 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:32:51.201447 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:32:51.201455 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:32:51.201463 | orchestrator |
2026-04-07 00:32:51.201471 | orchestrator | TASK [osism.services.docker : Manage service] **********************************
2026-04-07 00:32:51.201479 | orchestrator | Tuesday 07 April 2026 00:32:46 +0000 (0:00:01.513) 0:06:51.756 *********
2026-04-07 00:32:51.201487 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.201495 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:32:51.201502 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:32:51.201510 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:32:51.201518 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:32:51.201526 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:32:51.201534 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:32:51.201541 | orchestrator |
2026-04-07 00:32:51.201549 | orchestrator | TASK [osism.services.docker : Manage docker socket service] ********************
2026-04-07 00:32:51.201557 | orchestrator | Tuesday 07 April 2026 00:32:47 +0000 (0:00:01.453) 0:06:53.210 *********
2026-04-07 00:32:51.201565 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.201573 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:32:51.201581 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:32:51.201588 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:32:51.201596 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:32:51.201604 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:32:51.201612 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:32:51.201619 | orchestrator |
2026-04-07 00:32:51.201627 | orchestrator | TASK [osism.services.docker : Manage containerd service] ***********************
2026-04-07 00:32:51.201635 | orchestrator | Tuesday 07 April 2026 00:32:48 +0000 (0:00:01.192) 0:06:54.403 *********
2026-04-07 00:32:51.201643 | orchestrator | ok: [testbed-manager]
2026-04-07 00:32:51.201651 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:32:51.201659 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:32:51.201667 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:32:51.201674 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:32:51.201682 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:32:51.201697 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:32:51.201704 | orchestrator |
2026-04-07 00:32:51.201712 | orchestrator | TASK [osism.services.docker : Include bootstrap tasks] *************************
2026-04-07 00:32:51.201720 | orchestrator | Tuesday 07 April 2026 00:32:50 +0000 (0:00:01.239) 0:06:55.642 *********
2026-04-07 00:32:51.201728 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/bootstrap.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:32:51.201736 | orchestrator |
2026-04-07 00:32:51.201744 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-07 00:32:51.201752 | orchestrator | Tuesday 07 April 2026 00:32:50 +0000 (0:00:00.831) 0:06:56.474 *********
2026-04-07 00:32:51.201760 | orchestrator |
2026-04-07 00:32:51.201768 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-07 00:32:51.201776 | orchestrator | Tuesday 07 April 2026 00:32:50 +0000 (0:00:00.039) 0:06:56.513 *********
2026-04-07 00:32:51.201784 | orchestrator |
2026-04-07 00:32:51.201792 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-07 00:32:51.201800 | orchestrator | Tuesday 07 April 2026 00:32:51 +0000 (0:00:00.173) 0:06:56.686 *********
2026-04-07 00:32:51.201808 | orchestrator |
2026-04-07 00:32:51.201815 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-07 00:32:51.201828 | orchestrator | Tuesday 07 April 2026 00:32:51 +0000 (0:00:00.040) 0:06:56.727 *********
2026-04-07 00:33:18.334462 | orchestrator |
2026-04-07 00:33:18.334564 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-07 00:33:18.334585 | orchestrator | Tuesday 07 April 2026 00:32:51 +0000 (0:00:00.041) 0:06:56.768 *********
2026-04-07 00:33:18.334601 | orchestrator |
2026-04-07 00:33:18.334615 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-07 00:33:18.334631 | orchestrator | Tuesday 07 April 2026 00:32:51 +0000 (0:00:00.044) 0:06:56.812 *********
2026-04-07 00:33:18.334646 | orchestrator |
2026-04-07 00:33:18.334661 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-07 00:33:18.334675 | orchestrator | Tuesday 07 April 2026 00:32:51 +0000 (0:00:00.039) 0:06:56.851 *********
2026-04-07 00:33:18.334690 | orchestrator |
2026-04-07 00:33:18.334704 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] *****
2026-04-07 00:33:18.334719 | orchestrator | Tuesday 07 April 2026 00:32:51 +0000 (0:00:00.039) 0:06:56.891 *********
2026-04-07 00:33:18.334734 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:18.334749 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:18.334764 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:18.334778 | orchestrator |
2026-04-07 00:33:18.334793 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart rsyslog service] *************
2026-04-07 00:33:18.334807 | orchestrator | Tuesday 07 April 2026 00:32:52 +0000 (0:00:01.261) 0:06:58.152 *********
2026-04-07 00:33:18.334822 | orchestrator | changed: [testbed-manager]
2026-04-07 00:33:18.334837 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:33:18.334852 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:33:18.334867 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:33:18.334881 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:33:18.334894 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:33:18.334908 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:33:18.334921 | orchestrator |
2026-04-07 00:33:18.334983 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart logrotate service] ***********
2026-04-07 00:33:18.334995 | orchestrator | Tuesday 07 April 2026 00:32:53 +0000 (0:00:01.309) 0:06:59.462 *********
2026-04-07 00:33:18.335008 | orchestrator | changed: [testbed-manager]
2026-04-07 00:33:18.335020 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:33:18.335032 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:33:18.335044 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:33:18.335057 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:33:18.335069 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:33:18.335105 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:33:18.335117 | orchestrator |
2026-04-07 00:33:18.335130 | orchestrator | RUNNING HANDLER [osism.services.docker : Restart docker service] ***************
2026-04-07 00:33:18.335142 | orchestrator | Tuesday 07 April 2026 00:32:55 +0000 (0:00:01.203) 0:07:00.666 *********
2026-04-07 00:33:18.335154 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:33:18.335166 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:33:18.335178 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:33:18.335191 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:33:18.335205 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:33:18.335217 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:33:18.335231 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:33:18.335242 | orchestrator |
2026-04-07 00:33:18.335254 | orchestrator | RUNNING HANDLER [osism.services.docker : Wait after docker service restart] ****
2026-04-07 00:33:18.335266 | orchestrator | Tuesday 07 April 2026 00:32:57 +0000 (0:00:02.488) 0:07:03.154 *********
2026-04-07 00:33:18.335278 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:33:18.335290 | orchestrator |
2026-04-07 00:33:18.335302 | orchestrator | TASK [osism.services.docker : Add user to docker group] ************************
2026-04-07 00:33:18.335315 | orchestrator | Tuesday 07 April 2026 00:32:57 +0000 (0:00:00.092) 0:07:03.246 *********
2026-04-07 00:33:18.335327 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:18.335338 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:33:18.335349 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:33:18.335360 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:33:18.335372 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:33:18.335398 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:33:18.335409 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:33:18.335419 | orchestrator |
2026-04-07 00:33:18.335430 | orchestrator | TASK [osism.services.docker : Log into private registry and force re-authorization] ***
2026-04-07 00:33:18.335441 | orchestrator | Tuesday 07 April 2026 00:32:59 +0000 (0:00:01.345) 0:07:04.592 *********
2026-04-07 00:33:18.335451 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:33:18.335462 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:33:18.335472 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:33:18.335483 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:33:18.335498 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:33:18.335509 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:33:18.335519 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:33:18.335529 | orchestrator |
2026-04-07 00:33:18.335540 | orchestrator | TASK [osism.services.docker : Include facts tasks] *****************************
2026-04-07 00:33:18.335551 | orchestrator | Tuesday 07 April 2026 00:32:59 +0000 (0:00:00.529) 0:07:05.121 *********
2026-04-07 00:33:18.335562 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/facts.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:33:18.335575 | orchestrator |
2026-04-07 00:33:18.335586 | orchestrator | TASK [osism.services.docker : Create facts directory] **************************
2026-04-07 00:33:18.335598 | orchestrator | Tuesday 07 April 2026 00:33:00 +0000 (0:00:00.851) 0:07:05.973 *********
2026-04-07 00:33:18.335610 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:18.335622 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:18.335634 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:18.335646 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:18.335658 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:18.335670 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:18.335682 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:18.335694 | orchestrator |
2026-04-07 00:33:18.335706 | orchestrator | TASK [osism.services.docker : Copy docker fact files] **************************
2026-04-07 00:33:18.335719 | orchestrator | Tuesday 07 April 2026 00:33:01 +0000 (0:00:00.953) 0:07:06.926 *********
2026-04-07 00:33:18.335731 | orchestrator | ok: [testbed-manager] => (item=docker_containers)
2026-04-07 00:33:18.335773 | orchestrator | changed: [testbed-node-1] => (item=docker_containers)
2026-04-07 00:33:18.335786 | orchestrator | changed: [testbed-node-0] => (item=docker_containers)
2026-04-07 00:33:18.335798 | orchestrator | changed: [testbed-node-2] => (item=docker_containers)
2026-04-07 00:33:18.335810 | orchestrator | changed: [testbed-node-3] => (item=docker_containers)
2026-04-07 00:33:18.335822 | orchestrator | changed: [testbed-node-4] => (item=docker_containers)
2026-04-07 00:33:18.335833 | orchestrator | ok: [testbed-manager] => (item=docker_images)
2026-04-07 00:33:18.335844 | orchestrator | changed: [testbed-node-5] => (item=docker_containers)
2026-04-07 00:33:18.335855 | orchestrator | changed: [testbed-node-1] => (item=docker_images)
2026-04-07 00:33:18.335866 | orchestrator | changed: [testbed-node-0] => (item=docker_images)
2026-04-07 00:33:18.335878 | orchestrator | changed: [testbed-node-2] => (item=docker_images)
2026-04-07 00:33:18.335889 | orchestrator | changed: [testbed-node-3] => (item=docker_images)
2026-04-07 00:33:18.335902 | orchestrator | changed: [testbed-node-4] => (item=docker_images)
2026-04-07 00:33:18.335913 | orchestrator | changed: [testbed-node-5] => (item=docker_images)
2026-04-07 00:33:18.335946 | orchestrator |
2026-04-07 00:33:18.335958 | orchestrator | TASK [osism.commons.docker_compose : This install type is not supported] *******
2026-04-07 00:33:18.335970 | orchestrator | Tuesday 07 April 2026 00:33:03 +0000 (0:00:02.446) 0:07:09.373 *********
2026-04-07 00:33:18.335981 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:33:18.335993 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:33:18.336004 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:33:18.336016 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:33:18.336028 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:33:18.336062 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:33:18.336074 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:33:18.336085 | orchestrator |
2026-04-07 00:33:18.336097 | orchestrator | TASK [osism.commons.docker_compose : Include distribution specific install tasks] ***
2026-04-07 00:33:18.336108 | orchestrator | Tuesday 07 April 2026 00:33:04 +0000 (0:00:00.472) 0:07:09.845 *********
2026-04-07 00:33:18.336121 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/docker_compose/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:33:18.336135 | orchestrator |
2026-04-07 00:33:18.336146 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose apt preferences file] ***
2026-04-07 00:33:18.336157 | orchestrator | Tuesday 07 April 2026 00:33:05 +0000 (0:00:00.956) 0:07:10.802 *********
2026-04-07 00:33:18.336168 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:18.336179 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:18.336191 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:18.336203 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:18.336213 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:18.336225 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:18.336235 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:18.336247 | orchestrator |
2026-04-07 00:33:18.336260 | orchestrator | TASK [osism.commons.docker_compose : Get checksum of docker-compose file] ******
2026-04-07 00:33:18.336271 | orchestrator | Tuesday 07 April 2026 00:33:06 +0000 (0:00:00.889) 0:07:11.691 *********
2026-04-07 00:33:18.336282 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:18.336294 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:18.336306 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:18.336317 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:18.336329 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:18.336341 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:18.336352 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:18.336364 | orchestrator |
2026-04-07 00:33:18.336376 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose binary] *************
2026-04-07 00:33:18.336388 | orchestrator | Tuesday 07 April 2026 00:33:07 +0000 (0:00:00.920) 0:07:12.612 *********
2026-04-07 00:33:18.336401 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:33:18.336424 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:33:18.336437 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:33:18.336450 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:33:18.336463 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:33:18.336475 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:33:18.336487 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:33:18.336500 | orchestrator |
2026-04-07 00:33:18.336513 | orchestrator | TASK [osism.commons.docker_compose : Uninstall docker-compose package] *********
2026-04-07 00:33:18.336533 | orchestrator | Tuesday 07 April 2026 00:33:07 +0000 (0:00:00.472) 0:07:13.084 *********
2026-04-07 00:33:18.336546 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:18.336558 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:18.336571 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:18.336583 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:18.336595 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:18.336607 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:18.336620 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:18.336632 | orchestrator |
2026-04-07 00:33:18.336644 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose script] ***************
2026-04-07 00:33:18.336657 | orchestrator | Tuesday 07 April 2026 00:33:09 +0000 (0:00:01.556) 0:07:14.640 *********
2026-04-07 00:33:18.336669 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:33:18.336681 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:33:18.336694 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:33:18.336706 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:33:18.336719 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:33:18.336730 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:33:18.336742 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:33:18.336753 | orchestrator |
2026-04-07 00:33:18.336764 | orchestrator | TASK [osism.commons.docker_compose : Install docker-compose-plugin package] ****
2026-04-07 00:33:18.336776 | orchestrator | Tuesday 07 April 2026 00:33:09 +0000 (0:00:00.551) 0:07:15.191 *********
2026-04-07 00:33:18.336788 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:18.336799 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:33:18.336811 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:33:18.336822 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:33:18.336833 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:33:18.336844 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:33:18.336867 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:33:51.107437 | orchestrator |
2026-04-07 00:33:51.107596 | orchestrator | TASK [osism.commons.docker_compose : Copy osism.target systemd file] ***********
2026-04-07 00:33:51.107629 | orchestrator | Tuesday 07 April 2026 00:33:18 +0000 (0:00:08.744) 0:07:23.936 *********
2026-04-07 00:33:51.107651 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:51.107672 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:33:51.107693 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:33:51.107713 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:33:51.107732 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:33:51.107753 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:33:51.107774 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:33:51.107793 | orchestrator |
2026-04-07 00:33:51.107812 | orchestrator | TASK [osism.commons.docker_compose : Enable osism.target] **********************
2026-04-07 00:33:51.107832 | orchestrator | Tuesday 07 April 2026 00:33:19 +0000 (0:00:01.400) 0:07:25.337 *********
2026-04-07 00:33:51.107851 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:51.107908 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:33:51.107945 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:33:51.107965 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:33:51.107984 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:33:51.108004 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:33:51.108025 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:33:51.108044 | orchestrator |
2026-04-07 00:33:51.108063 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose systemd unit file] ****
2026-04-07 00:33:51.108119 | orchestrator | Tuesday 07 April 2026 00:33:21 +0000 (0:00:01.802) 0:07:27.140 *********
2026-04-07 00:33:51.108139 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:51.108158 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:33:51.108178 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:33:51.108196 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:33:51.108215 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:33:51.108234 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:33:51.108253 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:33:51.108272 | orchestrator |
2026-04-07 00:33:51.108291 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] *********************
2026-04-07 00:33:51.108309 | orchestrator | Tuesday 07 April 2026 00:33:23 +0000 (0:00:01.820) 0:07:28.960 *********
2026-04-07 00:33:51.108327 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:51.108347 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:51.108366 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:51.108385 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:51.108403 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:51.108423 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:51.108441 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:51.108460 | orchestrator |
2026-04-07 00:33:51.108512 | orchestrator | TASK [osism.commons.facts : Copy fact files] ***********************************
2026-04-07 00:33:51.108532 | orchestrator | Tuesday 07 April 2026 00:33:24 +0000 (0:00:00.822) 0:07:29.783 *********
2026-04-07 00:33:51.108550 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:33:51.108568 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:33:51.108586 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:33:51.108601 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:33:51.108619 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:33:51.108637 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:33:51.108655 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:33:51.108672 | orchestrator |
2026-04-07 00:33:51.108691 | orchestrator | TASK [osism.services.chrony : Check minimum and maximum number of servers] *****
2026-04-07 00:33:51.108709 | orchestrator | Tuesday 07 April 2026 00:33:25 +0000 (0:00:00.770) 0:07:30.553 *********
2026-04-07 00:33:51.108728 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:33:51.108747 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:33:51.108764 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:33:51.108782 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:33:51.108799 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:33:51.108817 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:33:51.108835 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:33:51.108853 | orchestrator |
2026-04-07 00:33:51.108870 | orchestrator | TASK [osism.services.chrony : Gather variables for each operating system] ******
2026-04-07 00:33:51.108920 | orchestrator | Tuesday 07 April 2026 00:33:25 +0000 (0:00:00.631) 0:07:31.184 *********
2026-04-07 00:33:51.108939 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:51.108956 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:51.108977 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:51.108995 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:51.109013 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:51.109033 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:51.109054 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:51.109074 | orchestrator |
2026-04-07 00:33:51.109095 | orchestrator | TASK [osism.services.chrony : Set chrony_conf_file variable to default value] ***
2026-04-07 00:33:51.109136 | orchestrator | Tuesday 07 April 2026 00:33:26 +0000 (0:00:00.492) 0:07:31.676 *********
2026-04-07 00:33:51.109157 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:51.109177 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:51.109197 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:51.109217 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:51.109237 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:51.109256 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:51.109277 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:51.109296 | orchestrator |
2026-04-07 00:33:51.109316 | orchestrator | TASK [osism.services.chrony : Set chrony_key_file variable to default value] ***
2026-04-07 00:33:51.109355 | orchestrator | Tuesday 07 April 2026 00:33:26 +0000 (0:00:00.487) 0:07:32.164 *********
2026-04-07 00:33:51.109375 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:51.109394 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:33:51.109415 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:51.109436 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:51.109455 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:51.109474 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:51.109493 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:51.109510 | orchestrator |
2026-04-07 00:33:51.109526 | orchestrator | TASK [osism.services.chrony : Populate service facts] **************************
2026-04-07 00:33:51.109543 | orchestrator | Tuesday 07 April 2026 00:33:27 +0000 (0:00:00.488) 0:07:32.653 *********
2026-04-07 00:33:51.109561 | orchestrator | ok: [testbed-manager]
2026-04-07 00:33:51.109578 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:33:51.109595 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:33:51.109613 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:33:51.109631 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:33:51.109647 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:33:51.109664 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:33:51.109680 | orchestrator | 2026-04-07 00:33:51.109725 | orchestrator | TASK [osism.services.chrony : Manage timesyncd service] ************************ 2026-04-07 00:33:51.109743 | orchestrator | Tuesday 07 April 2026 00:33:32 +0000 (0:00:05.427) 0:07:38.080 ********* 2026-04-07 00:33:51.109761 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:33:51.109778 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:33:51.109795 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:33:51.109812 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:33:51.109830 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:33:51.109847 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:33:51.109864 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:33:51.109955 | orchestrator | 2026-04-07 00:33:51.109975 | orchestrator | TASK [osism.services.chrony : Include distribution specific install tasks] ***** 2026-04-07 00:33:51.109992 | orchestrator | Tuesday 07 April 2026 00:33:33 +0000 (0:00:00.655) 0:07:38.736 ********* 2026-04-07 00:33:51.110011 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:33:51.110112 | orchestrator | 2026-04-07 00:33:51.110130 | orchestrator | TASK [osism.services.chrony : Install package] ********************************* 2026-04-07 00:33:51.110148 | orchestrator | Tuesday 07 April 2026 00:33:33 +0000 (0:00:00.777) 0:07:39.513 ********* 2026-04-07 00:33:51.110165 | orchestrator | ok: [testbed-manager] 2026-04-07 00:33:51.110183 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:33:51.110200 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:33:51.110217 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:33:51.110235 | 
orchestrator | ok: [testbed-node-3] 2026-04-07 00:33:51.110305 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:33:51.110323 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:33:51.110341 | orchestrator | 2026-04-07 00:33:51.110359 | orchestrator | TASK [osism.services.chrony : Manage chrony service] *************************** 2026-04-07 00:33:51.110377 | orchestrator | Tuesday 07 April 2026 00:33:36 +0000 (0:00:02.396) 0:07:41.910 ********* 2026-04-07 00:33:51.110394 | orchestrator | ok: [testbed-manager] 2026-04-07 00:33:51.110411 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:33:51.110429 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:33:51.110445 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:33:51.110463 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:33:51.110482 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:33:51.110500 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:33:51.110516 | orchestrator | 2026-04-07 00:33:51.110533 | orchestrator | TASK [osism.services.chrony : Check if configuration file exists] ************** 2026-04-07 00:33:51.110551 | orchestrator | Tuesday 07 April 2026 00:33:37 +0000 (0:00:01.232) 0:07:43.142 ********* 2026-04-07 00:33:51.110587 | orchestrator | ok: [testbed-manager] 2026-04-07 00:33:51.110605 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:33:51.110623 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:33:51.110642 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:33:51.110661 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:33:51.110677 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:33:51.110695 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:33:51.110713 | orchestrator | 2026-04-07 00:33:51.110732 | orchestrator | TASK [osism.services.chrony : Copy configuration file] ************************* 2026-04-07 00:33:51.110750 | orchestrator | Tuesday 07 April 2026 00:33:38 +0000 (0:00:00.875) 0:07:44.017 ********* 2026-04-07 00:33:51.110769 | orchestrator | changed: 
[testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-07 00:33:51.110790 | orchestrator | changed: [testbed-node-1] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-07 00:33:51.110808 | orchestrator | changed: [testbed-node-0] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-07 00:33:51.110827 | orchestrator | changed: [testbed-node-2] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-07 00:33:51.110845 | orchestrator | changed: [testbed-node-3] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-07 00:33:51.110874 | orchestrator | changed: [testbed-node-4] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-07 00:33:51.110923 | orchestrator | changed: [testbed-node-5] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-07 00:33:51.110940 | orchestrator | 2026-04-07 00:33:51.110957 | orchestrator | TASK [osism.services.lldpd : Include distribution specific install tasks] ****** 2026-04-07 00:33:51.110973 | orchestrator | Tuesday 07 April 2026 00:33:40 +0000 (0:00:01.715) 0:07:45.733 ********* 2026-04-07 00:33:51.110990 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/lldpd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:33:51.111008 | orchestrator | 2026-04-07 00:33:51.111024 | orchestrator | TASK [osism.services.lldpd : Install lldpd package] **************************** 2026-04-07 00:33:51.111040 | 
orchestrator | Tuesday 07 April 2026 00:33:41 +0000 (0:00:00.956) 0:07:46.689 ********* 2026-04-07 00:33:51.111057 | orchestrator | changed: [testbed-manager] 2026-04-07 00:33:51.111072 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:33:51.111088 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:33:51.111103 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:33:51.111120 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:33:51.111135 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:33:51.111151 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:33:51.111168 | orchestrator | 2026-04-07 00:33:51.111204 | orchestrator | TASK [osism.services.lldpd : Manage lldpd service] ***************************** 2026-04-07 00:34:21.949535 | orchestrator | Tuesday 07 April 2026 00:33:51 +0000 (0:00:09.945) 0:07:56.635 ********* 2026-04-07 00:34:21.949637 | orchestrator | ok: [testbed-manager] 2026-04-07 00:34:21.949651 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:34:21.949661 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:34:21.949671 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:34:21.949681 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:34:21.949691 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:34:21.949701 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:34:21.949711 | orchestrator | 2026-04-07 00:34:21.949722 | orchestrator | RUNNING HANDLER [osism.commons.docker_compose : Reload systemd daemon] ********* 2026-04-07 00:34:21.949732 | orchestrator | Tuesday 07 April 2026 00:33:52 +0000 (0:00:01.677) 0:07:58.312 ********* 2026-04-07 00:34:21.949768 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:34:21.949779 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:34:21.949789 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:34:21.949799 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:34:21.949808 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:34:21.949818 | orchestrator | ok: [testbed-node-4] 
2026-04-07 00:34:21.949827 | orchestrator | 2026-04-07 00:34:21.949837 | orchestrator | RUNNING HANDLER [osism.services.chrony : Restart chrony service] *************** 2026-04-07 00:34:21.949927 | orchestrator | Tuesday 07 April 2026 00:33:54 +0000 (0:00:01.450) 0:07:59.763 ********* 2026-04-07 00:34:21.949945 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.949963 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:21.949980 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.949999 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.950080 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.950103 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.950123 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.950141 | orchestrator | 2026-04-07 00:34:21.950159 | orchestrator | PLAY [Apply bootstrap role part 2] ********************************************* 2026-04-07 00:34:21.950183 | orchestrator | 2026-04-07 00:34:21.950205 | orchestrator | TASK [Include hardening role] ************************************************** 2026-04-07 00:34:21.950223 | orchestrator | Tuesday 07 April 2026 00:33:55 +0000 (0:00:01.303) 0:08:01.066 ********* 2026-04-07 00:34:21.950241 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:34:21.950262 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:34:21.950282 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:34:21.950300 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:34:21.950318 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:34:21.950345 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:34:21.950367 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:34:21.950386 | orchestrator | 2026-04-07 00:34:21.950406 | orchestrator | PLAY [Apply bootstrap roles part 3] ******************************************** 2026-04-07 00:34:21.950433 | orchestrator | 2026-04-07 00:34:21.950456 | orchestrator | TASK 
[osism.services.journald : Copy configuration file] *********************** 2026-04-07 00:34:21.950475 | orchestrator | Tuesday 07 April 2026 00:33:56 +0000 (0:00:00.524) 0:08:01.591 ********* 2026-04-07 00:34:21.950494 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.950514 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.950533 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.950553 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.950572 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:21.950590 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.950607 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.950624 | orchestrator | 2026-04-07 00:34:21.950642 | orchestrator | TASK [osism.services.journald : Manage journald service] *********************** 2026-04-07 00:34:21.950661 | orchestrator | Tuesday 07 April 2026 00:33:57 +0000 (0:00:01.248) 0:08:02.839 ********* 2026-04-07 00:34:21.950681 | orchestrator | ok: [testbed-manager] 2026-04-07 00:34:21.950700 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:34:21.950711 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:34:21.950721 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:34:21.950730 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:34:21.950740 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:34:21.950749 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:34:21.950759 | orchestrator | 2026-04-07 00:34:21.950770 | orchestrator | TASK [Include auditd role] ***************************************************** 2026-04-07 00:34:21.950787 | orchestrator | Tuesday 07 April 2026 00:33:58 +0000 (0:00:01.559) 0:08:04.398 ********* 2026-04-07 00:34:21.950803 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:34:21.950818 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:34:21.950835 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:34:21.950881 | orchestrator | skipping: [testbed-node-2] 
2026-04-07 00:34:21.950899 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:34:21.950937 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:34:21.950954 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:34:21.950993 | orchestrator | 2026-04-07 00:34:21.951018 | orchestrator | TASK [Include smartd role] ***************************************************** 2026-04-07 00:34:21.951029 | orchestrator | Tuesday 07 April 2026 00:33:59 +0000 (0:00:00.498) 0:08:04.897 ********* 2026-04-07 00:34:21.951039 | orchestrator | included: osism.services.smartd for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:34:21.951051 | orchestrator | 2026-04-07 00:34:21.951060 | orchestrator | TASK [osism.services.smartd : Include distribution specific install tasks] ***** 2026-04-07 00:34:21.951070 | orchestrator | Tuesday 07 April 2026 00:34:00 +0000 (0:00:00.796) 0:08:05.693 ********* 2026-04-07 00:34:21.951082 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/smartd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:34:21.951095 | orchestrator | 2026-04-07 00:34:21.951105 | orchestrator | TASK [osism.services.smartd : Install smartmontools package] ******************* 2026-04-07 00:34:21.951115 | orchestrator | Tuesday 07 April 2026 00:34:01 +0000 (0:00:00.934) 0:08:06.628 ********* 2026-04-07 00:34:21.951124 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.951134 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.951144 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.951155 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.951172 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.951189 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.951205 | 
orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:21.951227 | orchestrator | 2026-04-07 00:34:21.951281 | orchestrator | TASK [osism.services.smartd : Create /var/log/smartd directory] **************** 2026-04-07 00:34:21.951299 | orchestrator | Tuesday 07 April 2026 00:34:11 +0000 (0:00:10.003) 0:08:16.631 ********* 2026-04-07 00:34:21.951319 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.951336 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.951366 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:21.951387 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.951406 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.951424 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.951447 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.951466 | orchestrator | 2026-04-07 00:34:21.951484 | orchestrator | TASK [osism.services.smartd : Copy smartmontools configuration file] *********** 2026-04-07 00:34:21.951501 | orchestrator | Tuesday 07 April 2026 00:34:11 +0000 (0:00:00.686) 0:08:17.318 ********* 2026-04-07 00:34:21.951517 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.951535 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:21.951554 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.951570 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.951587 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.951598 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.951607 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.951617 | orchestrator | 2026-04-07 00:34:21.951626 | orchestrator | TASK [osism.services.smartd : Manage smartd service] *************************** 2026-04-07 00:34:21.951636 | orchestrator | Tuesday 07 April 2026 00:34:13 +0000 (0:00:01.304) 0:08:18.622 ********* 2026-04-07 00:34:21.951645 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.951655 | orchestrator | 
changed: [testbed-node-0] 2026-04-07 00:34:21.951664 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.951673 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.951683 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.951692 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.951702 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.951711 | orchestrator | 2026-04-07 00:34:21.951721 | orchestrator | RUNNING HANDLER [osism.services.journald : Restart journald service] *********** 2026-04-07 00:34:21.951746 | orchestrator | Tuesday 07 April 2026 00:34:14 +0000 (0:00:01.857) 0:08:20.480 ********* 2026-04-07 00:34:21.951756 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.951765 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:21.951775 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.951784 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.951794 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.951803 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.951818 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.951834 | orchestrator | 2026-04-07 00:34:21.951878 | orchestrator | RUNNING HANDLER [osism.services.smartd : Restart smartd service] *************** 2026-04-07 00:34:21.951894 | orchestrator | Tuesday 07 April 2026 00:34:16 +0000 (0:00:01.315) 0:08:21.795 ********* 2026-04-07 00:34:21.951912 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.951929 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:21.951946 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.951958 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.951968 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.951978 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.951988 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.951997 | orchestrator | 2026-04-07 
00:34:21.952007 | orchestrator | PLAY [Set state bootstrap] ***************************************************** 2026-04-07 00:34:21.952017 | orchestrator | 2026-04-07 00:34:21.952027 | orchestrator | TASK [Set osism.bootstrap.status fact] ***************************************** 2026-04-07 00:34:21.952037 | orchestrator | Tuesday 07 April 2026 00:34:17 +0000 (0:00:01.015) 0:08:22.810 ********* 2026-04-07 00:34:21.952047 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:34:21.952057 | orchestrator | 2026-04-07 00:34:21.952067 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2026-04-07 00:34:21.952077 | orchestrator | Tuesday 07 April 2026 00:34:18 +0000 (0:00:00.786) 0:08:23.596 ********* 2026-04-07 00:34:21.952086 | orchestrator | ok: [testbed-manager] 2026-04-07 00:34:21.952096 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:34:21.952105 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:34:21.952115 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:34:21.952125 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:34:21.952134 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:34:21.952144 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:34:21.952154 | orchestrator | 2026-04-07 00:34:21.952164 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2026-04-07 00:34:21.952174 | orchestrator | Tuesday 07 April 2026 00:34:18 +0000 (0:00:00.755) 0:08:24.351 ********* 2026-04-07 00:34:21.952183 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:21.952193 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:21.952203 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:21.952212 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:21.952222 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:21.952232 | 
orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:21.952241 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:21.952251 | orchestrator | 2026-04-07 00:34:21.952261 | orchestrator | TASK [Set osism.bootstrap.timestamp fact] ************************************** 2026-04-07 00:34:21.952270 | orchestrator | Tuesday 07 April 2026 00:34:20 +0000 (0:00:01.279) 0:08:25.631 ********* 2026-04-07 00:34:21.952280 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:34:21.952290 | orchestrator | 2026-04-07 00:34:21.952300 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2026-04-07 00:34:21.952309 | orchestrator | Tuesday 07 April 2026 00:34:20 +0000 (0:00:00.860) 0:08:26.492 ********* 2026-04-07 00:34:21.952319 | orchestrator | ok: [testbed-manager] 2026-04-07 00:34:21.952329 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:34:21.952347 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:34:21.952356 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:34:21.952366 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:34:21.952376 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:34:21.952385 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:34:21.952395 | orchestrator | 2026-04-07 00:34:21.952415 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2026-04-07 00:34:23.499200 | orchestrator | Tuesday 07 April 2026 00:34:21 +0000 (0:00:00.983) 0:08:27.475 ********* 2026-04-07 00:34:23.499323 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:23.499345 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:34:23.499361 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:34:23.499375 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:23.499389 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:34:23.499403 | 
orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:23.499419 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:34:23.499434 | orchestrator | 2026-04-07 00:34:23.499450 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:34:23.499467 | orchestrator | testbed-manager : ok=168  changed=40  unreachable=0 failed=0 skipped=42  rescued=0 ignored=0 2026-04-07 00:34:23.499484 | orchestrator | testbed-node-0 : ok=177  changed=69  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-04-07 00:34:23.499500 | orchestrator | testbed-node-1 : ok=177  changed=69  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2026-04-07 00:34:23.499542 | orchestrator | testbed-node-2 : ok=177  changed=69  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2026-04-07 00:34:23.499560 | orchestrator | testbed-node-3 : ok=175  changed=65  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-04-07 00:34:23.499577 | orchestrator | testbed-node-4 : ok=175  changed=65  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-04-07 00:34:23.499594 | orchestrator | testbed-node-5 : ok=175  changed=65  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-04-07 00:34:23.499610 | orchestrator | 2026-04-07 00:34:23.499626 | orchestrator | 2026-04-07 00:34:23.499643 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:34:23.499660 | orchestrator | Tuesday 07 April 2026 00:34:23 +0000 (0:00:01.259) 0:08:28.735 ********* 2026-04-07 00:34:23.499676 | orchestrator | =============================================================================== 2026-04-07 00:34:23.499693 | orchestrator | osism.commons.packages : Install required packages --------------------- 77.43s 2026-04-07 00:34:23.499710 | orchestrator | osism.commons.packages : Download required packages -------------------- 39.22s 2026-04-07 00:34:23.499725 | orchestrator | 
osism.commons.cleanup : Cleanup installed packages --------------------- 34.54s 2026-04-07 00:34:23.499741 | orchestrator | osism.commons.repository : Update package cache ------------------------ 18.36s 2026-04-07 00:34:23.499757 | orchestrator | osism.services.docker : Install docker package ------------------------- 11.84s 2026-04-07 00:34:23.499772 | orchestrator | osism.services.docker : Install containerd package --------------------- 11.81s 2026-04-07 00:34:23.499788 | orchestrator | osism.commons.systohc : Install util-linux-extra package --------------- 10.83s 2026-04-07 00:34:23.499803 | orchestrator | osism.commons.packages : Remove dependencies that are no longer required -- 10.83s 2026-04-07 00:34:23.499820 | orchestrator | osism.services.docker : Add repository --------------------------------- 10.22s 2026-04-07 00:34:23.499836 | orchestrator | osism.commons.cleanup : Remove cloudinit package ----------------------- 10.07s 2026-04-07 00:34:23.499882 | orchestrator | osism.services.smartd : Install smartmontools package ------------------ 10.00s 2026-04-07 00:34:23.499932 | orchestrator | osism.services.docker : Install docker-cli package --------------------- 10.00s 2026-04-07 00:34:23.499949 | orchestrator | osism.services.lldpd : Install lldpd package ---------------------------- 9.95s 2026-04-07 00:34:23.499964 | orchestrator | osism.commons.cleanup : Uninstall unattended-upgrades package ----------- 9.39s 2026-04-07 00:34:23.499989 | orchestrator | osism.services.rng : Install rng package -------------------------------- 9.20s 2026-04-07 00:34:23.500005 | orchestrator | osism.commons.docker_compose : Install docker-compose-plugin package ---- 8.74s 2026-04-07 00:34:23.500022 | orchestrator | osism.services.docker : Install apt-transport-https package ------------- 7.82s 2026-04-07 00:34:23.500038 | orchestrator | osism.commons.cleanup : Remove dependencies that are no longer required --- 7.62s 2026-04-07 00:34:23.500054 | orchestrator | 
osism.commons.sysctl : Set sysctl parameters on rabbitmq ---------------- 7.01s 2026-04-07 00:34:23.500070 | orchestrator | osism.commons.cleanup : Populate service facts -------------------------- 5.66s 2026-04-07 00:34:23.686937 | orchestrator | + osism apply fail2ban 2026-04-07 00:34:35.433789 | orchestrator | 2026-04-07 00:34:35 | INFO  | Prepare task for execution of fail2ban. 2026-04-07 00:34:35.530619 | orchestrator | 2026-04-07 00:34:35 | INFO  | Task 50e1782f-155f-4dcc-8cb3-0e4ec190edbd (fail2ban) was prepared for execution. 2026-04-07 00:34:35.530714 | orchestrator | 2026-04-07 00:34:35 | INFO  | It takes a moment until task 50e1782f-155f-4dcc-8cb3-0e4ec190edbd (fail2ban) has been started and output is visible here. 2026-04-07 00:34:56.454323 | orchestrator | 2026-04-07 00:34:56.454462 | orchestrator | PLAY [Apply role fail2ban] ***************************************************** 2026-04-07 00:34:56.454480 | orchestrator | 2026-04-07 00:34:56.454493 | orchestrator | TASK [osism.services.fail2ban : Include distribution specific install tasks] *** 2026-04-07 00:34:56.454505 | orchestrator | Tuesday 07 April 2026 00:34:38 +0000 (0:00:00.285) 0:00:00.285 ********* 2026-04-07 00:34:56.454517 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/fail2ban/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:34:56.454547 | orchestrator | 2026-04-07 00:34:56.454570 | orchestrator | TASK [osism.services.fail2ban : Install fail2ban package] ********************** 2026-04-07 00:34:56.454582 | orchestrator | Tuesday 07 April 2026 00:34:39 +0000 (0:00:01.055) 0:00:01.341 ********* 2026-04-07 00:34:56.454593 | orchestrator | changed: [testbed-manager] 2026-04-07 00:34:56.454605 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:34:56.454617 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:34:56.454628 | 
orchestrator | changed: [testbed-node-1]
2026-04-07 00:34:56.454638 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:34:56.454649 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:34:56.454660 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:34:56.454671 | orchestrator |
2026-04-07 00:34:56.454681 | orchestrator | TASK [osism.services.fail2ban : Copy configuration files] **********************
2026-04-07 00:34:56.454693 | orchestrator | Tuesday 07 April 2026 00:34:51 +0000 (0:00:11.804) 0:00:13.145 *********
2026-04-07 00:34:56.454703 | orchestrator | changed: [testbed-manager]
2026-04-07 00:34:56.454714 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:34:56.454725 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:34:56.454736 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:34:56.454746 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:34:56.454757 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:34:56.454768 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:34:56.454778 | orchestrator |
2026-04-07 00:34:56.454789 | orchestrator | TASK [osism.services.fail2ban : Manage fail2ban service] ***********************
2026-04-07 00:34:56.454800 | orchestrator | Tuesday 07 April 2026 00:34:53 +0000 (0:00:01.592) 0:00:14.738 *********
2026-04-07 00:34:56.454856 | orchestrator | ok: [testbed-manager]
2026-04-07 00:34:56.454870 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:34:56.454888 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:34:56.454909 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:34:56.454958 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:34:56.454976 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:34:56.454993 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:34:56.455009 | orchestrator |
2026-04-07 00:34:56.455027 | orchestrator | TASK [osism.services.fail2ban : Reload fail2ban configuration] *****************
2026-04-07 00:34:56.455045 | orchestrator | Tuesday 07 April 2026 00:34:54 +0000 (0:00:01.381) 0:00:16.120 *********
2026-04-07 00:34:56.455061 | orchestrator | changed: [testbed-manager]
2026-04-07 00:34:56.455079 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:34:56.455098 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:34:56.455115 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:34:56.455133 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:34:56.455151 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:34:56.455170 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:34:56.455187 | orchestrator |
2026-04-07 00:34:56.455206 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:34:56.455227 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:34:56.455246 | orchestrator | testbed-node-0 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:34:56.455263 | orchestrator | testbed-node-1 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:34:56.455281 | orchestrator | testbed-node-2 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:34:56.455301 | orchestrator | testbed-node-3 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:34:56.455320 | orchestrator | testbed-node-4 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:34:56.455358 | orchestrator | testbed-node-5 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:34:56.455372 | orchestrator |
2026-04-07 00:34:56.455383 | orchestrator |
2026-04-07 00:34:56.455394 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:34:56.455405 | orchestrator | Tuesday 07 April 2026 00:34:56 +0000 (0:00:01.578) 0:00:17.698 *********
2026-04-07 00:34:56.455416 | orchestrator | ===============================================================================
2026-04-07 00:34:56.455427 | orchestrator | osism.services.fail2ban : Install fail2ban package --------------------- 11.80s
2026-04-07 00:34:56.455438 | orchestrator | osism.services.fail2ban : Copy configuration files ---------------------- 1.59s
2026-04-07 00:34:56.455449 | orchestrator | osism.services.fail2ban : Reload fail2ban configuration ----------------- 1.58s
2026-04-07 00:34:56.455460 | orchestrator | osism.services.fail2ban : Manage fail2ban service ----------------------- 1.38s
2026-04-07 00:34:56.455471 | orchestrator | osism.services.fail2ban : Include distribution specific install tasks --- 1.06s
2026-04-07 00:34:56.626620 | orchestrator | + osism apply network
2026-04-07 00:35:07.878869 | orchestrator | 2026-04-07 00:35:07 | INFO  | Prepare task for execution of network.
2026-04-07 00:35:07.963092 | orchestrator | 2026-04-07 00:35:07 | INFO  | Task ee0f485e-2d74-4929-abf8-9fb0a0d3d843 (network) was prepared for execution.
2026-04-07 00:35:07.963179 | orchestrator | 2026-04-07 00:35:07 | INFO  | It takes a moment until task ee0f485e-2d74-4929-abf8-9fb0a0d3d843 (network) has been started and output is visible here.
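The PLAY RECAP block above is the quickest health check for a run like this: any host with a non-zero `failed=` or `unreachable=` counter means the play did not complete cleanly. A minimal sketch for extracting those counters from recap lines in a captured log (a hypothetical helper, not part of the job itself):

```python
import re

# Matches one PLAY RECAP line, e.g.
# "testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0"
RECAP_RE = re.compile(
    r"^(?P<host>\S+)\s*:\s*ok=(?P<ok>\d+)\s+changed=(?P<changed>\d+)\s+"
    r"unreachable=(?P<unreachable>\d+)\s+failed=(?P<failed>\d+)"
)

def parse_recap(lines):
    """Return {host: {counter: int, ...}} for every recap line that matches."""
    stats = {}
    for line in lines:
        m = RECAP_RE.match(line.strip())
        if m:
            stats[m.group("host")] = {
                k: int(v) for k, v in m.groupdict().items() if k != "host"
            }
    return stats

# Sample lines taken from the recap above.
recap = [
    "testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0",
    "testbed-node-0 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0",
]
stats = parse_recap(recap)
failed_hosts = [h for h, s in stats.items() if s["failed"] or s["unreachable"]]
```

With all counters at `failed=0 unreachable=0`, as in this run, `failed_hosts` stays empty.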
2026-04-07 00:35:35.535341 | orchestrator |
2026-04-07 00:35:35.535472 | orchestrator | PLAY [Apply role network] ******************************************************
2026-04-07 00:35:35.535502 | orchestrator |
2026-04-07 00:35:35.535522 | orchestrator | TASK [osism.commons.network : Gather variables for each operating system] ******
2026-04-07 00:35:35.535578 | orchestrator | Tuesday 07 April 2026 00:35:10 +0000 (0:00:00.287) 0:00:00.287 *********
2026-04-07 00:35:35.535596 | orchestrator | ok: [testbed-manager]
2026-04-07 00:35:35.535614 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:35:35.535632 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:35:35.535650 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:35:35.535668 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:35:35.535688 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:35:35.535706 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:35:35.535725 | orchestrator |
2026-04-07 00:35:35.535744 | orchestrator | TASK [osism.commons.network : Include type specific tasks] *********************
2026-04-07 00:35:35.535761 | orchestrator | Tuesday 07 April 2026 00:35:11 +0000 (0:00:00.583) 0:00:00.870 *********
2026-04-07 00:35:35.535820 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/netplan-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:35:35.535842 | orchestrator |
2026-04-07 00:35:35.535860 | orchestrator | TASK [osism.commons.network : Install required packages] ***********************
2026-04-07 00:35:35.535879 | orchestrator | Tuesday 07 April 2026 00:35:12 +0000 (0:00:01.024) 0:00:01.895 *********
2026-04-07 00:35:35.535899 | orchestrator | ok: [testbed-manager]
2026-04-07 00:35:35.535918 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:35:35.535936 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:35:35.535955 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:35:35.535968 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:35:35.535987 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:35:35.536003 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:35:35.536018 | orchestrator |
2026-04-07 00:35:35.536035 | orchestrator | TASK [osism.commons.network : Remove ifupdown package] *************************
2026-04-07 00:35:35.536051 | orchestrator | Tuesday 07 April 2026 00:35:15 +0000 (0:00:02.510) 0:00:04.406 *********
2026-04-07 00:35:35.536067 | orchestrator | ok: [testbed-manager]
2026-04-07 00:35:35.536083 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:35:35.536099 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:35:35.536115 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:35:35.536132 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:35:35.536149 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:35:35.536167 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:35:35.536186 | orchestrator |
2026-04-07 00:35:35.536203 | orchestrator | TASK [osism.commons.network : Create required directories] *********************
2026-04-07 00:35:35.536222 | orchestrator | Tuesday 07 April 2026 00:35:16 +0000 (0:00:01.717) 0:00:06.123 *********
2026-04-07 00:35:35.536233 | orchestrator | ok: [testbed-manager] => (item=/etc/netplan)
2026-04-07 00:35:35.536245 | orchestrator | ok: [testbed-node-0] => (item=/etc/netplan)
2026-04-07 00:35:35.536256 | orchestrator | ok: [testbed-node-1] => (item=/etc/netplan)
2026-04-07 00:35:35.536267 | orchestrator | ok: [testbed-node-2] => (item=/etc/netplan)
2026-04-07 00:35:35.536277 | orchestrator | ok: [testbed-node-3] => (item=/etc/netplan)
2026-04-07 00:35:35.536288 | orchestrator | ok: [testbed-node-4] => (item=/etc/netplan)
2026-04-07 00:35:35.536299 | orchestrator | ok: [testbed-node-5] => (item=/etc/netplan)
2026-04-07 00:35:35.536310 | orchestrator |
2026-04-07 00:35:35.536320 | orchestrator | TASK [osism.commons.network : Prepare netplan configuration template] **********
2026-04-07 00:35:35.536331 | orchestrator | Tuesday 07 April 2026 00:35:17 +0000 (0:00:01.164) 0:00:07.287 *********
2026-04-07 00:35:35.536342 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-04-07 00:35:35.536354 | orchestrator | ok: [testbed-node-2 -> localhost]
2026-04-07 00:35:35.536364 | orchestrator | ok: [testbed-node-1 -> localhost]
2026-04-07 00:35:35.536375 | orchestrator | ok: [testbed-node-3 -> localhost]
2026-04-07 00:35:35.536386 | orchestrator | ok: [testbed-manager -> localhost]
2026-04-07 00:35:35.536396 | orchestrator | ok: [testbed-node-4 -> localhost]
2026-04-07 00:35:35.536407 | orchestrator | ok: [testbed-node-5 -> localhost]
2026-04-07 00:35:35.536418 | orchestrator |
2026-04-07 00:35:35.536429 | orchestrator | TASK [osism.commons.network : Copy netplan configuration] **********************
2026-04-07 00:35:35.536454 | orchestrator | Tuesday 07 April 2026 00:35:21 +0000 (0:00:03.626) 0:00:10.914 *********
2026-04-07 00:35:35.536465 | orchestrator | changed: [testbed-manager]
2026-04-07 00:35:35.536476 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:35:35.536502 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:35:35.536513 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:35:35.536523 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:35:35.536534 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:35:35.536544 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:35:35.536555 | orchestrator |
2026-04-07 00:35:35.536566 | orchestrator | TASK [osism.commons.network : Remove netplan configuration template] ***********
2026-04-07 00:35:35.536577 | orchestrator | Tuesday 07 April 2026 00:35:23 +0000 (0:00:01.656) 0:00:12.570 *********
2026-04-07 00:35:35.536587 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-04-07 00:35:35.536598 | orchestrator | ok: [testbed-manager -> localhost]
2026-04-07 00:35:35.536609 | orchestrator | ok: [testbed-node-1 -> localhost]
2026-04-07 00:35:35.536619 | orchestrator | ok: [testbed-node-2 -> localhost]
2026-04-07 00:35:35.536630 | orchestrator | ok: [testbed-node-5 -> localhost]
2026-04-07 00:35:35.536640 | orchestrator | ok: [testbed-node-3 -> localhost]
2026-04-07 00:35:35.536651 | orchestrator | ok: [testbed-node-4 -> localhost]
2026-04-07 00:35:35.536662 | orchestrator |
2026-04-07 00:35:35.536673 | orchestrator | TASK [osism.commons.network : Check if path for interface file exists] *********
2026-04-07 00:35:35.536684 | orchestrator | Tuesday 07 April 2026 00:35:25 +0000 (0:00:01.991) 0:00:14.562 *********
2026-04-07 00:35:35.536695 | orchestrator | ok: [testbed-manager]
2026-04-07 00:35:35.536705 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:35:35.536716 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:35:35.536727 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:35:35.536737 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:35:35.536748 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:35:35.536758 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:35:35.536795 | orchestrator |
2026-04-07 00:35:35.536814 | orchestrator | TASK [osism.commons.network : Copy interfaces file] ****************************
2026-04-07 00:35:35.536861 | orchestrator | Tuesday 07 April 2026 00:35:26 +0000 (0:00:00.980) 0:00:15.542 *********
2026-04-07 00:35:35.536881 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:35:35.536908 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:35:35.536929 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:35:35.536946 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:35:35.536963 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:35:35.536981 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:35:35.536998 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:35:35.537016 | orchestrator |
2026-04-07 00:35:35.537034 | orchestrator | TASK [osism.commons.network : Install package networkd-dispatcher] *************
2026-04-07 00:35:35.537052 | orchestrator | Tuesday 07 April 2026 00:35:26 +0000 (0:00:00.763) 0:00:16.306 *********
2026-04-07 00:35:35.537071 | orchestrator | ok: [testbed-manager]
2026-04-07 00:35:35.537089 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:35:35.537107 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:35:35.537118 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:35:35.537129 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:35:35.537140 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:35:35.537150 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:35:35.537161 | orchestrator |
2026-04-07 00:35:35.537172 | orchestrator | TASK [osism.commons.network : Copy dispatcher scripts] *************************
2026-04-07 00:35:35.537183 | orchestrator | Tuesday 07 April 2026 00:35:29 +0000 (0:00:02.173) 0:00:18.480 *********
2026-04-07 00:35:35.537194 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:35:35.537205 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:35:35.537215 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:35:35.537226 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:35:35.537237 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:35:35.537247 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:35:35.537270 | orchestrator | changed: [testbed-manager] => (item={'src': '/opt/configuration/network/iptables.sh', 'dest': 'routable.d/iptables.sh'})
2026-04-07 00:35:35.537282 | orchestrator |
2026-04-07 00:35:35.537293 | orchestrator | TASK [osism.commons.network : Manage service networkd-dispatcher] **************
2026-04-07 00:35:35.537304 | orchestrator | Tuesday 07 April 2026 00:35:29 +0000 (0:00:00.851) 0:00:19.332 *********
2026-04-07 00:35:35.537315 | orchestrator | ok: [testbed-manager]
2026-04-07 00:35:35.537325 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:35:35.537336 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:35:35.537347 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:35:35.537357 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:35:35.537368 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:35:35.537378 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:35:35.537389 | orchestrator |
2026-04-07 00:35:35.537400 | orchestrator | TASK [osism.commons.network : Include cleanup tasks] ***************************
2026-04-07 00:35:35.537410 | orchestrator | Tuesday 07 April 2026 00:35:31 +0000 (0:00:01.447) 0:00:20.780 *********
2026-04-07 00:35:35.537422 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-netplan.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:35:35.537435 | orchestrator |
2026-04-07 00:35:35.537446 | orchestrator | TASK [osism.commons.network : List existing configuration files] ***************
2026-04-07 00:35:35.537456 | orchestrator | Tuesday 07 April 2026 00:35:32 +0000 (0:00:01.160) 0:00:21.940 *********
2026-04-07 00:35:35.537467 | orchestrator | ok: [testbed-manager]
2026-04-07 00:35:35.537478 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:35:35.537488 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:35:35.537499 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:35:35.537509 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:35:35.537520 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:35:35.537530 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:35:35.537541 | orchestrator |
2026-04-07 00:35:35.537552 | orchestrator | TASK [osism.commons.network : Set network_configured_files fact] ***************
2026-04-07 00:35:35.537563 | orchestrator | Tuesday 07 April 2026 00:35:33 +0000 (0:00:01.133) 0:00:23.074 *********
2026-04-07 00:35:35.537574 | orchestrator | ok: [testbed-manager]
2026-04-07 00:35:35.537584 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:35:35.537595 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:35:35.537606 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:35:35.537616 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:35:35.537627 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:35:35.537637 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:35:35.537648 | orchestrator |
2026-04-07 00:35:35.537659 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] ***************
2026-04-07 00:35:35.537677 | orchestrator | Tuesday 07 April 2026 00:35:34 +0000 (0:00:00.744) 0:00:23.818 *********
2026-04-07 00:35:35.537688 | orchestrator | skipping: [testbed-manager] => (item=/etc/netplan/01-osism.yaml)
2026-04-07 00:35:35.537699 | orchestrator | skipping: [testbed-node-0] => (item=/etc/netplan/01-osism.yaml)
2026-04-07 00:35:35.537710 | orchestrator | skipping: [testbed-node-1] => (item=/etc/netplan/01-osism.yaml)
2026-04-07 00:35:35.537720 | orchestrator | skipping: [testbed-node-2] => (item=/etc/netplan/01-osism.yaml)
2026-04-07 00:35:35.537731 | orchestrator | changed: [testbed-manager] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-07 00:35:35.537741 | orchestrator | skipping: [testbed-node-3] => (item=/etc/netplan/01-osism.yaml)
2026-04-07 00:35:35.537752 | orchestrator | skipping: [testbed-node-4] => (item=/etc/netplan/01-osism.yaml)
2026-04-07 00:35:35.537763 | orchestrator | changed: [testbed-node-0] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-07 00:35:35.537863 | orchestrator | changed: [testbed-node-1] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-07 00:35:35.537875 | orchestrator | skipping: [testbed-node-5] => (item=/etc/netplan/01-osism.yaml)
2026-04-07 00:35:35.537894 | orchestrator | changed: [testbed-node-2] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-07 00:35:35.537905 | orchestrator | changed: [testbed-node-3] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-07 00:35:35.537916 | orchestrator | changed: [testbed-node-4] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-07 00:35:35.537927 | orchestrator | changed: [testbed-node-5] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-07 00:35:35.537937 | orchestrator |
2026-04-07 00:35:35.537960 | orchestrator | TASK [osism.commons.network : Include dummy interfaces] ************************
2026-04-07 00:35:50.097034 | orchestrator | Tuesday 07 April 2026 00:35:35 +0000 (0:00:01.059) 0:00:24.878 *********
2026-04-07 00:35:50.097155 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:35:50.097182 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:35:50.097201 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:35:50.097219 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:35:50.097238 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:35:50.097256 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:35:50.097274 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:35:50.097293 | orchestrator |
2026-04-07 00:35:50.097311 | orchestrator | TASK [osism.commons.network : Include vxlan interfaces] ************************
2026-04-07 00:35:50.097330 | orchestrator | Tuesday 07 April 2026 00:35:36 +0000 (0:00:00.745) 0:00:25.623 *********
2026-04-07 00:35:50.097350 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/vxlan-interfaces.yml for testbed-manager, testbed-node-1, testbed-node-0, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:35:50.097370 | orchestrator |
2026-04-07 00:35:50.097389 | orchestrator | TASK [osism.commons.network : Create systemd networkd netdev files] ************
2026-04-07 00:35:50.097407 | orchestrator | Tuesday 07 April 2026 00:35:40 +0000 (0:00:03.912) 0:00:29.535 *********
2026-04-07 00:35:50.097427 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.5', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'addresses': ['192.168.112.5/20']}})
2026-04-07 00:35:50.097448 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.11', 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097467 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.10', 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097486 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.12', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097504 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.5', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'addresses': ['192.168.128.5/20']}})
2026-04-07 00:35:50.097523 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.13', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097541 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.14', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097575 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.15', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097624 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.11', 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.11/20']}})
2026-04-07 00:35:50.097651 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.12', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.12/20']}})
2026-04-07 00:35:50.097671 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.10', 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.10/20']}})
2026-04-07 00:35:50.097712 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.13', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.13/20']}})
2026-04-07 00:35:50.097733 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.14', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.14/20']}})
2026-04-07 00:35:50.097775 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.15', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'addresses': ['192.168.128.15/20']}})
2026-04-07 00:35:50.097796 | orchestrator |
2026-04-07 00:35:50.097815 | orchestrator | TASK [osism.commons.network : Create systemd networkd network files] ***********
2026-04-07 00:35:50.097834 | orchestrator | Tuesday 07 April 2026 00:35:45 +0000 (0:00:04.988) 0:00:34.524 *********
2026-04-07 00:35:50.097853 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.11', 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097873 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.13', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097891 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.5', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'addresses': ['192.168.112.5/20']}})
2026-04-07 00:35:50.097911 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.10', 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097930 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.15', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097948 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.12', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.097967 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.5', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'addresses': ['192.168.128.5/20']}})
2026-04-07 00:35:50.097997 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.13', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.13/20']}})
2026-04-07 00:35:50.098072 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.11', 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.11/20']}})
2026-04-07 00:35:50.098095 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.14', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-07 00:35:50.098113 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.10', 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.10/20']}})
2026-04-07 00:35:50.098132 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.15', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'addresses': ['192.168.128.15/20']}})
2026-04-07 00:35:50.098162 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.12', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.12/20']}})
2026-04-07 00:36:02.583739 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.14', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.14/20']}})
2026-04-07 00:36:02.583906 | orchestrator |
2026-04-07 00:36:02.583924 | orchestrator | TASK [osism.commons.network : Include networkd cleanup tasks] ******************
2026-04-07 00:36:02.583938 | orchestrator | Tuesday 07 April 2026 00:35:50 +0000 (0:00:05.164) 0:00:39.689 *********
2026-04-07 00:36:02.583950 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-networkd.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:36:02.583963 | orchestrator |
2026-04-07 00:36:02.583974 | orchestrator | TASK [osism.commons.network : List existing configuration files] ***************
2026-04-07 00:36:02.583985 | orchestrator | Tuesday 07 April 2026 00:35:51 +0000 (0:00:01.147) 0:00:40.836 *********
2026-04-07 00:36:02.583997 | orchestrator | ok: [testbed-manager]
2026-04-07 00:36:02.584009 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:36:02.584020 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:36:02.584031 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:36:02.584042 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:36:02.584053 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:36:02.584063 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:36:02.584074 | orchestrator |
2026-04-07 00:36:02.584085 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] ***************
2026-04-07 00:36:02.584096 | orchestrator | Tuesday 07 April 2026 00:35:52 +0000 (0:00:01.108) 0:00:41.945 *********
2026-04-07 00:36:02.584107 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.network)
2026-04-07 00:36:02.584118 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.network)
2026-04-07 00:36:02.584129 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.netdev)
2026-04-07 00:36:02.584250 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.netdev)
2026-04-07 00:36:02.584267 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:36:02.584281 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.network)
2026-04-07 00:36:02.584294 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.network)
2026-04-07 00:36:02.584306 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.netdev)
2026-04-07 00:36:02.584319 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.netdev)
2026-04-07 00:36:02.584332 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.network)
2026-04-07 00:36:02.584345 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.network)
2026-04-07 00:36:02.584375 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.netdev)
2026-04-07 00:36:02.584389 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.netdev)
2026-04-07 00:36:02.584402 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:36:02.584416 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.network)
2026-04-07 00:36:02.584429 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.network)
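The logged vxlan items fully determine the overlay layout: two unicast VXLAN fabrics (vxlan0 with VNI 42, vxlan1 with VNI 23), MTU 1350, each host's `local_ip` as the tunnel endpoint, and a `dests` list of all peer VTEPs. The role's actual templates are not shown in the log, but for testbed-node-0's vxlan0 the generated systemd-networkd pair would plausibly take this shape (a sketch, assuming FDB-based unicast flooding; exact option set is an assumption):

```ini
; /etc/systemd/network/30-vxlan0.netdev (sketch, values taken from the log items)
[NetDev]
Name=vxlan0
Kind=vxlan
MTUBytes=1350

[VXLAN]
VNI=42
Local=192.168.16.10

; /etc/systemd/network/30-vxlan0.network (sketch)
[Match]
Name=vxlan0

; One all-zero-MAC FDB entry per remote VTEP floods BUM traffic to that peer.
[BridgeFDB]
MACAddress=00:00:00:00:00:00
Destination=192.168.16.11
; ...repeat one [BridgeFDB] section for each remaining address in 'dests'
```

On the manager, the same file would additionally carry `Address=192.168.112.5/20` in a `[Network]` section, matching its non-empty `addresses` item.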
2026-04-07 00:36:02.584442 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.netdev)
2026-04-07 00:36:02.584454 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.netdev)
2026-04-07 00:36:02.584469 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:36:02.584486 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.network)
2026-04-07 00:36:02.584500 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.network)
2026-04-07 00:36:02.584512 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.netdev)
2026-04-07 00:36:02.584526 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.netdev)
2026-04-07 00:36:02.584538 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:36:02.584551 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.network)
2026-04-07 00:36:02.584564 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.network)
2026-04-07 00:36:02.584576 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.netdev)
2026-04-07 00:36:02.584589 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.netdev)
2026-04-07 00:36:02.584602 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:36:02.584615 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:36:02.584627 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.network)
2026-04-07 00:36:02.584638 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.network)
2026-04-07 00:36:02.584649 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.netdev)
2026-04-07 00:36:02.584659 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.netdev)
2026-04-07 00:36:02.584670 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:36:02.584681 | orchestrator |
2026-04-07 00:36:02.584692 | orchestrator | TASK [osism.commons.network : Include network extra init] **********************
2026-04-07 00:36:02.584722 | orchestrator | Tuesday 07 April 2026 00:35:53 +0000 (0:00:00.676) 0:00:42.622 *********
2026-04-07 00:36:02.584735 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/network-extra-init.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:36:02.584767 | orchestrator |
2026-04-07 00:36:02.584779 | orchestrator | TASK [osism.commons.network : Deploy network-extra-init script] ****************
2026-04-07 00:36:02.584790 | orchestrator | Tuesday 07 April 2026 00:35:54 +0000 (0:00:01.090) 0:00:43.712 *********
2026-04-07 00:36:02.584811 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:36:02.584827 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:36:02.584848 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:36:02.584863 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:36:02.584880 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:36:02.584908 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:36:02.584927 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:36:02.584946 | orchestrator |
2026-04-07 00:36:02.584965 | orchestrator | TASK [osism.commons.network : Deploy network-extra-init systemd service] *******
2026-04-07 00:36:02.584983 | orchestrator | Tuesday 07 April 2026 00:35:55 +0000 (0:00:00.587) 0:00:44.416 *********
2026-04-07 00:36:02.585002 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:36:02.585021 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:36:02.585040 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:36:02.585058 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:36:02.585076 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:36:02.585096 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:36:02.585116 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:36:02.585136 | orchestrator |
2026-04-07 00:36:02.585156 | orchestrator | TASK [osism.commons.network : Enable and start network-extra-init service] *****
2026-04-07 00:36:02.585176 | orchestrator | Tuesday 07 April 2026 00:35:55 +0000 (0:00:00.730) 0:00:45.004 *********
2026-04-07 00:36:02.585195 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:36:02.585216 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:36:02.585237 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:36:02.585278 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:36:02.585298 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:36:02.585317 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:36:02.585336 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:36:02.585355 | orchestrator |
2026-04-07 00:36:02.585374 | orchestrator | TASK [osism.commons.network : Disable and stop network-extra-init service] *****
2026-04-07 00:36:02.585394 | orchestrator | Tuesday 07 April 2026 00:35:56 +0000 (0:00:00.730) 0:00:45.734 *********
2026-04-07 00:36:02.585411 | orchestrator | ok: [testbed-manager]
2026-04-07 00:36:02.585431 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:36:02.585449 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:36:02.585468 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:36:02.585488 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:36:02.585506 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:36:02.585524 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:36:02.585542 | orchestrator |
2026-04-07 00:36:02.585560 | orchestrator | TASK [osism.commons.network : Remove network-extra-init systemd service] *******
2026-04-07 00:36:02.585578 | orchestrator | Tuesday 07 April 2026 00:35:58 +0000 (0:00:01.625) 0:00:47.359 *********
2026-04-07 00:36:02.585597 | orchestrator | ok: [testbed-manager]
2026-04-07 00:36:02.585616 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:36:02.585634 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:36:02.585652 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:36:02.585664 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:36:02.585675 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:36:02.585685 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:36:02.585696 | orchestrator |
2026-04-07 00:36:02.585707 | orchestrator | TASK [osism.commons.network : Remove network-extra-init script] ****************
2026-04-07 00:36:02.585718 | orchestrator | Tuesday 07 April 2026 00:35:59 +0000 (0:00:01.201) 0:00:48.561 *********
2026-04-07 00:36:02.585729 | orchestrator | ok: [testbed-manager]
2026-04-07 00:36:02.585739 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:36:02.585806 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:36:02.585821 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:36:02.585832 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:36:02.585843 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:36:02.585853 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:36:02.585864 | orchestrator |
2026-04-07 00:36:02.585882 | orchestrator | RUNNING HANDLER [osism.commons.network : Reload systemd-networkd] **************
2026-04-07 00:36:02.585912 | orchestrator | Tuesday 07 April 2026 00:36:01 +0000 (0:00:02.138) 0:00:50.699 *********
2026-04-07 00:36:02.585929 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:36:02.585947 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:36:02.585964 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:36:02.585980 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:36:02.585997 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:36:02.586014 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:36:02.586159 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:36:02.586171 | orchestrator |
2026-04-07 00:36:02.586183 | orchestrator | RUNNING HANDLER [osism.commons.network : Netplan configuration changed] ********
2026-04-07 00:36:02.586194 | orchestrator | Tuesday 07 April 2026 00:36:01 +0000 (0:00:00.594) 0:00:51.293 *********
2026-04-07 00:36:02.586204 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:36:02.586215 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:36:02.586226 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:36:02.586236 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:36:02.586247 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:36:02.586257 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:36:02.586268 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:36:02.586279 | orchestrator |
2026-04-07 00:36:02.586290 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:36:02.586301 | orchestrator | testbed-manager : ok=25  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0
2026-04-07 00:36:02.586314 | orchestrator | testbed-node-0 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0
2026-04-07 00:36:02.586341 | orchestrator | testbed-node-1 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0
2026-04-07 00:36:02.821318 | orchestrator | testbed-node-2 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0
2026-04-07 00:36:02.821423 | orchestrator | testbed-node-3 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0
2026-04-07 00:36:02.821438 | orchestrator | testbed-node-4 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0
2026-04-07 00:36:02.821450 | orchestrator | testbed-node-5 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0
2026-04-07 00:36:02.821462 | orchestrator |
2026-04-07 00:36:02.821474 | orchestrator |
2026-04-07 00:36:02.821486 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:36:02.821499 | orchestrator | Tuesday 07 April 2026 00:36:02 +0000 (0:00:00.630) 0:00:51.923 *********
2026-04-07 00:36:02.821510 | orchestrator | ===============================================================================
2026-04-07 00:36:02.821521 | orchestrator | osism.commons.network : Create systemd networkd network files ----------- 5.16s
2026-04-07 00:36:02.821531 | orchestrator | osism.commons.network : Create systemd networkd netdev files ------------ 4.99s
2026-04-07 00:36:02.821543 | orchestrator | osism.commons.network : Include vxlan interfaces ------------------------ 3.91s
2026-04-07 00:36:02.821553 | orchestrator | osism.commons.network : Prepare netplan configuration template ---------- 3.63s
2026-04-07 00:36:02.821564 | orchestrator | osism.commons.network : Install required packages ----------------------- 2.51s
2026-04-07 00:36:02.821575 | orchestrator | osism.commons.network : Install package networkd-dispatcher ------------- 2.17s
2026-04-07 00:36:02.821586 | orchestrator | osism.commons.network : Remove network-extra-init script ---------------- 2.14s
2026-04-07 00:36:02.821597 | orchestrator | osism.commons.network : Remove netplan configuration template ----------- 1.99s
2026-04-07 00:36:02.821634 | orchestrator | osism.commons.network : Remove ifupdown package ------------------------- 1.72s
2026-04-07 00:36:02.821646 | orchestrator | osism.commons.network : Copy netplan configuration ---------------------- 1.66s
2026-04-07 00:36:02.821656 | orchestrator | osism.commons.network : Disable and stop network-extra-init service ----- 1.63s
2026-04-07 00:36:02.821667 | orchestrator | osism.commons.network : Manage service networkd-dispatcher -------------- 1.45s
2026-04-07 00:36:02.821678 | orchestrator | osism.commons.network : Remove network-extra-init systemd service ------- 1.20s
2026-04-07 00:36:02.821689 | orchestrator | osism.commons.network : Create required directories --------------------- 1.16s
2026-04-07 00:36:02.821700 | orchestrator | osism.commons.network : Include cleanup tasks --------------------------- 1.16s
2026-04-07 00:36:02.821711 | orchestrator | osism.commons.network : Include networkd cleanup tasks ------------------ 1.15s
2026-04-07 00:36:02.821722 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.13s
2026-04-07 00:36:02.821732 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.11s
2026-04-07 00:36:02.821792 | orchestrator | osism.commons.network : Include network extra init ---------------------- 1.09s
2026-04-07 00:36:02.821805 | orchestrator | osism.commons.network : Remove unused configuration files --------------- 1.06s
2026-04-07 00:36:02.992649 | orchestrator | + osism apply wireguard
2026-04-07 00:36:14.322756 | orchestrator | 2026-04-07 00:36:14 | INFO  | Prepare task for execution of wireguard.
2026-04-07 00:36:14.385551 | orchestrator | 2026-04-07 00:36:14 | INFO  | Task 59473f29-e462-4104-948b-73ce9982c274 (wireguard) was prepared for execution.
2026-04-07 00:36:14.385656 | orchestrator | 2026-04-07 00:36:14 | INFO  | It takes a moment until task 59473f29-e462-4104-948b-73ce9982c274 (wireguard) has been started and output is visible here.
2026-04-07 00:36:31.499597 | orchestrator |
2026-04-07 00:36:31.499671 | orchestrator | PLAY [Apply role wireguard] ****************************************************
2026-04-07 00:36:31.499678 | orchestrator |
2026-04-07 00:36:31.499683 | orchestrator | TASK [osism.services.wireguard : Install iptables package] *********************
2026-04-07 00:36:31.499687 | orchestrator | Tuesday 07 April 2026 00:36:17 +0000 (0:00:00.212) 0:00:00.212 *********
2026-04-07 00:36:31.499692 | orchestrator | ok: [testbed-manager]
2026-04-07 00:36:31.499697 | orchestrator |
2026-04-07 00:36:31.499701 | orchestrator | TASK [osism.services.wireguard : Install wireguard package] ********************
2026-04-07 00:36:31.499706 | orchestrator | Tuesday 07 April 2026 00:36:18 +0000 (0:00:01.501) 0:00:01.713 *********
2026-04-07 00:36:31.499710 | orchestrator | changed: [testbed-manager]
2026-04-07 00:36:31.499744 | orchestrator |
2026-04-07 00:36:31.499748 | orchestrator | TASK [osism.services.wireguard : Create public and private key - server] *******
2026-04-07 00:36:31.499752 | orchestrator | Tuesday 07 April 2026 00:36:24 +0000 (0:00:05.447) 0:00:07.161 *********
2026-04-07 00:36:31.499756 | orchestrator | changed: [testbed-manager]
2026-04-07 00:36:31.499760 | orchestrator |
2026-04-07 00:36:31.499764 | orchestrator | TASK [osism.services.wireguard : Create preshared key] *************************
2026-04-07 00:36:31.499768 | orchestrator | Tuesday 07 April 2026 00:36:24 +0000 (0:00:00.436) 0:00:07.726 *********
2026-04-07 00:36:31.499772 | orchestrator | changed: [testbed-manager]
2026-04-07 00:36:31.499776 | orchestrator |
2026-04-07 00:36:31.499780 | orchestrator | TASK [osism.services.wireguard : Get preshared key] ****************************
2026-04-07 00:36:31.499783 | orchestrator | Tuesday 07 April 2026 00:36:25 +0000 (0:00:00.541) 0:00:08.163 *********
2026-04-07 00:36:31.499788 | orchestrator | ok: [testbed-manager]
2026-04-07 00:36:31.499792 | orchestrator |
2026-04-07 00:36:31.499796 | orchestrator | TASK [osism.services.wireguard : Get public key - server] **********************
2026-04-07 00:36:31.499800 | orchestrator | Tuesday 07 April 2026 00:36:25 +0000 (0:00:00.541) 0:00:08.704 *********
2026-04-07 00:36:31.499804 | orchestrator | ok: [testbed-manager]
2026-04-07 00:36:31.499808 | orchestrator |
2026-04-07 00:36:31.499811 | orchestrator | TASK [osism.services.wireguard : Get private key - server] *********************
2026-04-07 00:36:31.499815 | orchestrator | Tuesday 07 April 2026 00:36:26 +0000 (0:00:00.392) 0:00:09.097 *********
2026-04-07 00:36:31.499837 | orchestrator | ok: [testbed-manager]
2026-04-07 00:36:31.499841 | orchestrator |
2026-04-07 00:36:31.499844 | orchestrator | TASK [osism.services.wireguard : Copy wg0.conf configuration file] *************
2026-04-07 00:36:31.499848 | orchestrator | Tuesday 07 April 2026 00:36:26 +0000 (0:00:00.410) 0:00:09.508 *********
2026-04-07 00:36:31.499852 | orchestrator | changed: [testbed-manager]
2026-04-07 00:36:31.499856 | orchestrator |
2026-04-07 00:36:31.499859 | orchestrator | TASK [osism.services.wireguard : Copy client configuration files] **************
2026-04-07 00:36:31.499863 | orchestrator | Tuesday 07 April 2026 00:36:27 +0000 (0:00:01.124) 0:00:10.633 *********
2026-04-07 00:36:31.499867 | orchestrator | changed: [testbed-manager] => (item=None)
2026-04-07 00:36:31.499871 | orchestrator | changed: [testbed-manager]
2026-04-07 00:36:31.499875 | orchestrator |
2026-04-07 00:36:31.499878 | orchestrator | TASK [osism.services.wireguard : Manage wg-quick@wg0.service service] **********
2026-04-07 00:36:31.499882 | orchestrator | Tuesday 07 April 2026 00:36:28 +0000 (0:00:00.895) 0:00:11.528 *********
2026-04-07 00:36:31.499886 | orchestrator | changed: [testbed-manager]
2026-04-07 00:36:31.499889 | orchestrator |
2026-04-07 00:36:31.499893 | orchestrator | RUNNING HANDLER [osism.services.wireguard : Restart wg0 service] ***************
2026-04-07 00:36:31.499897 | orchestrator | Tuesday 07 April 2026 00:36:30 +0000 (0:00:01.905) 0:00:13.434 *********
2026-04-07 00:36:31.499901 | orchestrator | changed: [testbed-manager]
2026-04-07 00:36:31.499904 | orchestrator |
2026-04-07 00:36:31.499908 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:36:31.499912 | orchestrator | testbed-manager : ok=11  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:36:31.499917 | orchestrator |
2026-04-07 00:36:31.499921 | orchestrator |
2026-04-07 00:36:31.499925 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:36:31.499928 | orchestrator | Tuesday 07 April 2026 00:36:31 +0000 (0:00:00.867) 0:00:14.302 *********
2026-04-07 00:36:31.499932 | orchestrator | ===============================================================================
2026-04-07 00:36:31.499936 | orchestrator | osism.services.wireguard : Install wireguard package -------------------- 5.45s
2026-04-07 00:36:31.499940 | orchestrator | osism.services.wireguard : Manage wg-quick@wg0.service service ---------- 1.91s
2026-04-07 00:36:31.499944 | orchestrator | osism.services.wireguard : Install iptables package --------------------- 1.50s
2026-04-07 00:36:31.499947 | orchestrator | osism.services.wireguard : Copy wg0.conf configuration file ------------- 1.13s
2026-04-07 00:36:31.499951 | orchestrator | osism.services.wireguard : Copy client configuration files -------------- 0.90s
2026-04-07 00:36:31.499955 | orchestrator | osism.services.wireguard : Restart wg0 service -------------------------- 0.87s
2026-04-07 00:36:31.499959 | orchestrator | osism.services.wireguard : Create public and private key - server ------- 0.56s
2026-04-07 00:36:31.499962 | orchestrator | osism.services.wireguard : Get preshared key ---------------------------- 0.54s
2026-04-07 00:36:31.499966 | orchestrator | osism.services.wireguard : Create preshared key ------------------------- 0.44s
2026-04-07 00:36:31.499970 | orchestrator | osism.services.wireguard : Get private key - server --------------------- 0.41s
2026-04-07 00:36:31.499973 | orchestrator | osism.services.wireguard : Get public key - server ---------------------- 0.39s
2026-04-07 00:36:31.671369 | orchestrator | + sh -c /opt/configuration/scripts/prepare-wireguard-configuration.sh
2026-04-07 00:36:31.707342 | orchestrator | % Total % Received % Xferd Average Speed Time Time Time Current
2026-04-07 00:36:31.707479 | orchestrator | Dload Upload Total Spent Left Speed
2026-04-07 00:36:31.788812 | orchestrator | 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 15 100 15 0 0 183 0 --:--:-- --:--:-- --:--:-- 185
2026-04-07 00:36:31.801101 | orchestrator | + osism apply --environment custom workarounds
2026-04-07 00:36:33.023957 | orchestrator | 2026-04-07 00:36:33 | INFO  | Trying to run play workarounds in environment custom
2026-04-07 00:36:43.093567 | orchestrator | 2026-04-07 00:36:43 | INFO  | Prepare task for execution of workarounds.
2026-04-07 00:36:43.173545 | orchestrator | 2026-04-07 00:36:43 | INFO  | Task c95929f7-755f-4c17-aaa0-ae077bd88aa4 (workarounds) was prepared for execution.
2026-04-07 00:36:43.173661 | orchestrator | 2026-04-07 00:36:43 | INFO  | It takes a moment until task c95929f7-755f-4c17-aaa0-ae077bd88aa4 (workarounds) has been started and output is visible here.
2026-04-07 00:37:07.143060 | orchestrator |
2026-04-07 00:37:07.143187 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:37:07.143209 | orchestrator |
2026-04-07 00:37:07.143225 | orchestrator | TASK [Group hosts based on virtualization_role] ********************************
2026-04-07 00:37:07.143241 | orchestrator | Tuesday 07 April 2026 00:36:46 +0000 (0:00:00.138) 0:00:00.138 *********
2026-04-07 00:37:07.143256 | orchestrator | changed: [testbed-manager] => (item=virtualization_role_guest)
2026-04-07 00:37:07.143271 | orchestrator | changed: [testbed-node-0] => (item=virtualization_role_guest)
2026-04-07 00:37:07.143285 | orchestrator | changed: [testbed-node-1] => (item=virtualization_role_guest)
2026-04-07 00:37:07.143301 | orchestrator | changed: [testbed-node-2] => (item=virtualization_role_guest)
2026-04-07 00:37:07.143316 | orchestrator | changed: [testbed-node-3] => (item=virtualization_role_guest)
2026-04-07 00:37:07.143332 | orchestrator | changed: [testbed-node-4] => (item=virtualization_role_guest)
2026-04-07 00:37:07.143347 | orchestrator | changed: [testbed-node-5] => (item=virtualization_role_guest)
2026-04-07 00:37:07.143362 | orchestrator |
2026-04-07 00:37:07.143378 | orchestrator | PLAY [Apply netplan configuration on the manager node] *************************
2026-04-07 00:37:07.143394 | orchestrator |
2026-04-07 00:37:07.143411 | orchestrator | TASK [Apply netplan configuration] *********************************************
2026-04-07 00:37:07.143427 | orchestrator | Tuesday 07 April 2026 00:36:46 +0000 (0:00:00.608) 0:00:00.746 *********
2026-04-07 00:37:07.143444 | orchestrator | ok: [testbed-manager]
2026-04-07 00:37:07.143463 | orchestrator |
2026-04-07 00:37:07.143480 | orchestrator | PLAY [Apply netplan configuration on all other nodes] **************************
2026-04-07 00:37:07.143496 | orchestrator |
2026-04-07 00:37:07.143513 | orchestrator | TASK [Apply netplan configuration] *********************************************
2026-04-07 00:37:07.143546 | orchestrator | Tuesday 07 April 2026 00:36:49 +0000 (0:00:02.430) 0:00:03.177 *********
2026-04-07 00:37:07.143557 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:37:07.143567 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:37:07.143577 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:37:07.143587 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:37:07.143598 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:37:07.143607 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:37:07.143617 | orchestrator |
2026-04-07 00:37:07.143627 | orchestrator | PLAY [Add custom CA certificates to non-manager nodes] *************************
2026-04-07 00:37:07.143637 | orchestrator |
2026-04-07 00:37:07.143648 | orchestrator | TASK [Copy custom CA certificates] *********************************************
2026-04-07 00:37:07.143658 | orchestrator | Tuesday 07 April 2026 00:36:51 +0000 (0:00:02.378) 0:00:05.555 *********
2026-04-07 00:37:07.143669 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-04-07 00:37:07.143680 | orchestrator | changed: [testbed-node-3] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-04-07 00:37:07.143732 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-04-07 00:37:07.143751 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-04-07 00:37:07.143767 | orchestrator | changed: [testbed-node-4] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-04-07 00:37:07.143782 | orchestrator | changed: [testbed-node-5] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-04-07 00:37:07.143792 | orchestrator |
2026-04-07 00:37:07.143829 | orchestrator | TASK [Run update-ca-certificates] **********************************************
2026-04-07 00:37:07.143840 | orchestrator | Tuesday 07 April 2026 00:36:52 +0000 (0:00:01.369) 0:00:06.925 *********
2026-04-07 00:37:07.143849 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:37:07.143859 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:37:07.143869 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:37:07.143879 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:37:07.143888 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:37:07.143898 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:37:07.143908 | orchestrator |
2026-04-07 00:37:07.143918 | orchestrator | TASK [Run update-ca-trust] *****************************************************
2026-04-07 00:37:07.143927 | orchestrator | Tuesday 07 April 2026 00:36:56 +0000 (0:00:03.710) 0:00:10.635 *********
2026-04-07 00:37:07.143937 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:37:07.143947 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:37:07.143956 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:37:07.143966 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:37:07.143976 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:37:07.143985 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:37:07.143995 | orchestrator |
2026-04-07 00:37:07.144005 | orchestrator | PLAY [Add a workaround service] ************************************************
2026-04-07 00:37:07.144015 | orchestrator |
2026-04-07 00:37:07.144039 | orchestrator | TASK [Copy workarounds.sh scripts] *********************************************
2026-04-07 00:37:07.144049 | orchestrator | Tuesday 07 April 2026 00:36:57 +0000 (0:00:00.512) 0:00:11.147 *********
2026-04-07 00:37:07.144058 | orchestrator | changed: [testbed-manager]
2026-04-07 00:37:07.144069 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:37:07.144079 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:37:07.144089 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:37:07.144098 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:37:07.144108 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:37:07.144118 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:37:07.144127 | orchestrator |
2026-04-07 00:37:07.144137 | orchestrator | TASK [Copy workarounds systemd unit file] **************************************
2026-04-07 00:37:07.144147 | orchestrator | Tuesday 07 April 2026 00:36:58 +0000 (0:00:01.728) 0:00:12.875 *********
2026-04-07 00:37:07.144157 | orchestrator | changed: [testbed-manager]
2026-04-07 00:37:07.144167 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:37:07.144176 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:37:07.144186 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:37:07.144195 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:37:07.144205 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:37:07.144233 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:37:07.144243 | orchestrator |
2026-04-07 00:37:07.144253 | orchestrator | TASK [Reload systemd daemon] ***************************************************
2026-04-07 00:37:07.144263 | orchestrator | Tuesday 07 April 2026 00:37:00 +0000 (0:00:01.436) 0:00:14.312 *********
2026-04-07 00:37:07.144273 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:37:07.144283 | orchestrator | ok: [testbed-manager]
2026-04-07 00:37:07.144292 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:37:07.144302 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:37:07.144312 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:37:07.144321 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:37:07.144331 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:37:07.144340 | orchestrator |
2026-04-07 00:37:07.144350 | orchestrator | TASK [Enable workarounds.service (Debian)] *************************************
2026-04-07 00:37:07.144360 | orchestrator | Tuesday 07 April 2026 00:37:01 +0000 (0:00:01.594) 0:00:15.907 *********
2026-04-07 00:37:07.144370 | orchestrator | changed: [testbed-manager]
2026-04-07 00:37:07.144379 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:37:07.144389 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:37:07.144399 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:37:07.144409 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:37:07.144426 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:37:07.144436 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:37:07.144445 | orchestrator |
2026-04-07 00:37:07.144455 | orchestrator | TASK [Enable and start workarounds.service (RedHat)] ***************************
2026-04-07 00:37:07.144465 | orchestrator | Tuesday 07 April 2026 00:37:03 +0000 (0:00:01.704) 0:00:17.611 *********
2026-04-07 00:37:07.144475 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:37:07.144484 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:37:07.144494 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:37:07.144504 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:37:07.144513 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:37:07.144523 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:37:07.144532 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:37:07.144542 | orchestrator |
2026-04-07 00:37:07.144552 | orchestrator | PLAY [On Ubuntu 24.04 install python3-docker from Debian Sid] ******************
2026-04-07 00:37:07.144562 | orchestrator |
2026-04-07 00:37:07.144572 | orchestrator | TASK [Install python3-docker] **************************************************
2026-04-07 00:37:07.144581 | orchestrator | Tuesday 07 April 2026 00:37:04 +0000 (0:00:00.691) 0:00:18.303 *********
2026-04-07 00:37:07.144591 | orchestrator | ok: [testbed-manager]
2026-04-07 00:37:07.144601 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:37:07.144611 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:37:07.144620 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:37:07.144630 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:37:07.144640 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:37:07.144649 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:37:07.144659 | orchestrator |
2026-04-07 00:37:07.144669 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:37:07.144680 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-07 00:37:07.144710 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:37:07.144721 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:37:07.144731 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:37:07.144740 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:37:07.144750 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:37:07.144768 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:37:07.144784 | orchestrator |
2026-04-07 00:37:07.144801 | orchestrator |
2026-04-07 00:37:07.144819 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:37:07.144836 | orchestrator | Tuesday 07 April 2026 00:37:07 +0000 (0:00:02.743) 0:00:21.047 *********
2026-04-07 00:37:07.144853 | orchestrator | ===============================================================================
2026-04-07 00:37:07.144870 | orchestrator | Run update-ca-certificates ---------------------------------------------- 3.71s
2026-04-07 00:37:07.144894 | orchestrator | Install python3-docker -------------------------------------------------- 2.74s
2026-04-07 00:37:07.144913 | orchestrator | Apply netplan configuration --------------------------------------------- 2.43s
2026-04-07 00:37:07.144930 | orchestrator | Apply netplan configuration --------------------------------------------- 2.38s
2026-04-07 00:37:07.144948 | orchestrator | Copy workarounds.sh scripts --------------------------------------------- 1.73s
2026-04-07 00:37:07.144976 | orchestrator | Enable workarounds.service (Debian) ------------------------------------- 1.70s
2026-04-07 00:37:07.144994 | orchestrator | Reload systemd daemon --------------------------------------------------- 1.59s
2026-04-07 00:37:07.145004 | orchestrator | Copy workarounds systemd unit file -------------------------------------- 1.44s
2026-04-07 00:37:07.145013 | orchestrator | Copy custom CA certificates --------------------------------------------- 1.37s
2026-04-07 00:37:07.145030 | orchestrator | Enable and start workarounds.service (RedHat) --------------------------- 0.69s
2026-04-07 00:37:07.145046 | orchestrator | Group hosts based on virtualization_role -------------------------------- 0.61s
2026-04-07 00:37:07.145072 | orchestrator | Run update-ca-trust ----------------------------------------------------- 0.51s
2026-04-07 00:37:07.564002 | orchestrator | + osism apply reboot -l testbed-nodes -e ireallymeanit=yes
2026-04-07 00:37:18.848572 | orchestrator | 2026-04-07 00:37:18 | INFO  | Prepare task for execution of reboot.
2026-04-07 00:37:18.926347 | orchestrator | 2026-04-07 00:37:18 | INFO  | Task aba7ed0f-19d0-4144-8366-24e96a104d15 (reboot) was prepared for execution.
2026-04-07 00:37:18.926449 | orchestrator | 2026-04-07 00:37:18 | INFO  | It takes a moment until task aba7ed0f-19d0-4144-8366-24e96a104d15 (reboot) has been started and output is visible here.
2026-04-07 00:37:29.793116 | orchestrator | 2026-04-07 00:37:29.793236 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-07 00:37:29.793255 | orchestrator | 2026-04-07 00:37:29.793266 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-07 00:37:29.793279 | orchestrator | Tuesday 07 April 2026 00:37:22 +0000 (0:00:00.221) 0:00:00.221 ********* 2026-04-07 00:37:29.793290 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:37:29.793303 | orchestrator | 2026-04-07 00:37:29.793314 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-07 00:37:29.793325 | orchestrator | Tuesday 07 April 2026 00:37:22 +0000 (0:00:00.122) 0:00:00.343 ********* 2026-04-07 00:37:29.793337 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:37:29.793348 | orchestrator | 2026-04-07 00:37:29.793359 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-07 00:37:29.793371 | orchestrator | Tuesday 07 April 2026 00:37:23 +0000 (0:00:01.265) 0:00:01.609 ********* 2026-04-07 00:37:29.793384 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:37:29.793396 | orchestrator | 2026-04-07 00:37:29.793409 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-07 00:37:29.793421 | orchestrator | 2026-04-07 00:37:29.793433 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-07 00:37:29.793446 | orchestrator | Tuesday 07 April 2026 00:37:23 +0000 (0:00:00.095) 0:00:01.705 ********* 2026-04-07 00:37:29.793458 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:37:29.793471 | orchestrator | 2026-04-07 00:37:29.793484 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-07 00:37:29.793496 | orchestrator | Tuesday 07 April 2026 
00:37:23 +0000 (0:00:00.084) 0:00:01.790 ********* 2026-04-07 00:37:29.793508 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:37:29.793520 | orchestrator | 2026-04-07 00:37:29.793532 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-07 00:37:29.793544 | orchestrator | Tuesday 07 April 2026 00:37:24 +0000 (0:00:01.027) 0:00:02.818 ********* 2026-04-07 00:37:29.793552 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:37:29.793559 | orchestrator | 2026-04-07 00:37:29.793566 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-07 00:37:29.793573 | orchestrator | 2026-04-07 00:37:29.793580 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-07 00:37:29.793587 | orchestrator | Tuesday 07 April 2026 00:37:24 +0000 (0:00:00.105) 0:00:02.923 ********* 2026-04-07 00:37:29.793594 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:37:29.793600 | orchestrator | 2026-04-07 00:37:29.793607 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-07 00:37:29.793639 | orchestrator | Tuesday 07 April 2026 00:37:24 +0000 (0:00:00.092) 0:00:03.015 ********* 2026-04-07 00:37:29.793646 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:37:29.793653 | orchestrator | 2026-04-07 00:37:29.793660 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-07 00:37:29.793668 | orchestrator | Tuesday 07 April 2026 00:37:25 +0000 (0:00:01.016) 0:00:04.032 ********* 2026-04-07 00:37:29.793727 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:37:29.793735 | orchestrator | 2026-04-07 00:37:29.793743 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-07 00:37:29.793750 | orchestrator | 2026-04-07 00:37:29.793758 | orchestrator | TASK [Exit playbook, if 
user did not mean to reboot systems] ******************* 2026-04-07 00:37:29.793766 | orchestrator | Tuesday 07 April 2026 00:37:26 +0000 (0:00:00.103) 0:00:04.135 ********* 2026-04-07 00:37:29.793773 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:37:29.793782 | orchestrator | 2026-04-07 00:37:29.793790 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-07 00:37:29.793798 | orchestrator | Tuesday 07 April 2026 00:37:26 +0000 (0:00:00.088) 0:00:04.224 ********* 2026-04-07 00:37:29.793806 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:37:29.793813 | orchestrator | 2026-04-07 00:37:29.793821 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-07 00:37:29.793829 | orchestrator | Tuesday 07 April 2026 00:37:27 +0000 (0:00:01.014) 0:00:05.238 ********* 2026-04-07 00:37:29.793836 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:37:29.793844 | orchestrator | 2026-04-07 00:37:29.793852 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-07 00:37:29.793860 | orchestrator | 2026-04-07 00:37:29.793868 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-07 00:37:29.793876 | orchestrator | Tuesday 07 April 2026 00:37:27 +0000 (0:00:00.093) 0:00:05.332 ********* 2026-04-07 00:37:29.793885 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:37:29.793893 | orchestrator | 2026-04-07 00:37:29.793901 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-07 00:37:29.793909 | orchestrator | Tuesday 07 April 2026 00:37:27 +0000 (0:00:00.154) 0:00:05.487 ********* 2026-04-07 00:37:29.793917 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:37:29.793924 | orchestrator | 2026-04-07 00:37:29.793932 | orchestrator | TASK [Reboot system - wait for the reboot to complete] 
************************* 2026-04-07 00:37:29.793940 | orchestrator | Tuesday 07 April 2026 00:37:28 +0000 (0:00:00.993) 0:00:06.480 ********* 2026-04-07 00:37:29.793948 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:37:29.793955 | orchestrator | 2026-04-07 00:37:29.793963 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-07 00:37:29.793971 | orchestrator | 2026-04-07 00:37:29.793978 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-07 00:37:29.793986 | orchestrator | Tuesday 07 April 2026 00:37:28 +0000 (0:00:00.084) 0:00:06.565 ********* 2026-04-07 00:37:29.793994 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:37:29.794002 | orchestrator | 2026-04-07 00:37:29.794010 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-07 00:37:29.794069 | orchestrator | Tuesday 07 April 2026 00:37:28 +0000 (0:00:00.073) 0:00:06.639 ********* 2026-04-07 00:37:29.794081 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:37:29.794095 | orchestrator | 2026-04-07 00:37:29.794106 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-07 00:37:29.794116 | orchestrator | Tuesday 07 April 2026 00:37:29 +0000 (0:00:01.061) 0:00:07.700 ********* 2026-04-07 00:37:29.794147 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:37:29.794160 | orchestrator | 2026-04-07 00:37:29.794170 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:37:29.794182 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:37:29.794206 | orchestrator | testbed-node-1 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:37:29.794218 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  
rescued=0 ignored=0 2026-04-07 00:37:29.794230 | orchestrator | testbed-node-3 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:37:29.794242 | orchestrator | testbed-node-4 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:37:29.794253 | orchestrator | testbed-node-5 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:37:29.794265 | orchestrator | 2026-04-07 00:37:29.794276 | orchestrator | 2026-04-07 00:37:29.794287 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:37:29.794299 | orchestrator | Tuesday 07 April 2026 00:37:29 +0000 (0:00:00.032) 0:00:07.733 ********* 2026-04-07 00:37:29.794311 | orchestrator | =============================================================================== 2026-04-07 00:37:29.794323 | orchestrator | Reboot system - do not wait for the reboot to complete ------------------ 6.38s 2026-04-07 00:37:29.794336 | orchestrator | Exit playbook, if user did not mean to reboot systems ------------------- 0.62s 2026-04-07 00:37:29.794348 | orchestrator | Reboot system - wait for the reboot to complete ------------------------- 0.52s 2026-04-07 00:37:29.907816 | orchestrator | + osism apply wait-for-connection -l testbed-nodes -e ireallymeanit=yes 2026-04-07 00:37:41.158506 | orchestrator | 2026-04-07 00:37:41 | INFO  | Prepare task for execution of wait-for-connection. 2026-04-07 00:37:41.236968 | orchestrator | 2026-04-07 00:37:41 | INFO  | Task 60515bc4-aa6d-4291-b333-0836c3d5b607 (wait-for-connection) was prepared for execution. 2026-04-07 00:37:41.237051 | orchestrator | 2026-04-07 00:37:41 | INFO  | It takes a moment until task 60515bc4-aa6d-4291-b333-0836c3d5b607 (wait-for-connection) has been started and output is visible here. 
2026-04-07 00:37:55.914315 | orchestrator | 2026-04-07 00:37:55.914438 | orchestrator | PLAY [Wait until remote systems are reachable] ********************************* 2026-04-07 00:37:55.914456 | orchestrator | 2026-04-07 00:37:55.914469 | orchestrator | TASK [Wait until remote system is reachable] *********************************** 2026-04-07 00:37:55.914481 | orchestrator | Tuesday 07 April 2026 00:37:44 +0000 (0:00:00.225) 0:00:00.225 ********* 2026-04-07 00:37:55.914492 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:37:55.914505 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:37:55.914516 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:37:55.914527 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:37:55.914538 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:37:55.914549 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:37:55.914560 | orchestrator | 2026-04-07 00:37:55.914571 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:37:55.914582 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:37:55.914623 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:37:55.914636 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:37:55.914647 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:37:55.914718 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:37:55.914730 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:37:55.914766 | orchestrator | 2026-04-07 00:37:55.914778 | orchestrator | 2026-04-07 00:37:55.914789 | orchestrator | TASKS RECAP 
******************************************************************** 2026-04-07 00:37:55.914800 | orchestrator | Tuesday 07 April 2026 00:37:55 +0000 (0:00:11.467) 0:00:11.693 ********* 2026-04-07 00:37:55.914811 | orchestrator | =============================================================================== 2026-04-07 00:37:55.914822 | orchestrator | Wait until remote system is reachable ---------------------------------- 11.47s 2026-04-07 00:37:56.039422 | orchestrator | + osism apply hddtemp 2026-04-07 00:38:07.200533 | orchestrator | 2026-04-07 00:38:07 | INFO  | Prepare task for execution of hddtemp. 2026-04-07 00:38:07.276860 | orchestrator | 2026-04-07 00:38:07 | INFO  | Task 3ba832b8-3edf-4d18-9927-b2966813d3f0 (hddtemp) was prepared for execution. 2026-04-07 00:38:07.276965 | orchestrator | 2026-04-07 00:38:07 | INFO  | It takes a moment until task 3ba832b8-3edf-4d18-9927-b2966813d3f0 (hddtemp) has been started and output is visible here. 2026-04-07 00:38:34.422255 | orchestrator | 2026-04-07 00:38:34.422373 | orchestrator | PLAY [Apply role hddtemp] ****************************************************** 2026-04-07 00:38:34.422390 | orchestrator | 2026-04-07 00:38:34.422403 | orchestrator | TASK [osism.services.hddtemp : Gather variables for each operating system] ***** 2026-04-07 00:38:34.422414 | orchestrator | Tuesday 07 April 2026 00:38:10 +0000 (0:00:00.336) 0:00:00.336 ********* 2026-04-07 00:38:34.422425 | orchestrator | ok: [testbed-manager] 2026-04-07 00:38:34.422437 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:38:34.422449 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:38:34.422459 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:38:34.422471 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:38:34.422481 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:38:34.422492 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:38:34.422503 | orchestrator | 2026-04-07 00:38:34.422514 | orchestrator | TASK [osism.services.hddtemp : Include 
distribution specific install tasks] **** 2026-04-07 00:38:34.422525 | orchestrator | Tuesday 07 April 2026 00:38:11 +0000 (0:00:00.591) 0:00:00.927 ********* 2026-04-07 00:38:34.422538 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:38:34.422552 | orchestrator | 2026-04-07 00:38:34.422563 | orchestrator | TASK [osism.services.hddtemp : Remove hddtemp package] ************************* 2026-04-07 00:38:34.422574 | orchestrator | Tuesday 07 April 2026 00:38:12 +0000 (0:00:01.101) 0:00:02.028 ********* 2026-04-07 00:38:34.422584 | orchestrator | ok: [testbed-manager] 2026-04-07 00:38:34.422595 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:38:34.422606 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:38:34.422617 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:38:34.422685 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:38:34.422698 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:38:34.422709 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:38:34.422719 | orchestrator | 2026-04-07 00:38:34.422730 | orchestrator | TASK [osism.services.hddtemp : Enable Kernel Module drivetemp] ***************** 2026-04-07 00:38:34.422741 | orchestrator | Tuesday 07 April 2026 00:38:14 +0000 (0:00:02.503) 0:00:04.532 ********* 2026-04-07 00:38:34.422752 | orchestrator | changed: [testbed-manager] 2026-04-07 00:38:34.422764 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:38:34.422776 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:38:34.422787 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:38:34.422801 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:38:34.422813 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:38:34.422826 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:38:34.422838 | 
orchestrator | 2026-04-07 00:38:34.422851 | orchestrator | TASK [osism.services.hddtemp : Check if drivetemp module is available] ********* 2026-04-07 00:38:34.422864 | orchestrator | Tuesday 07 April 2026 00:38:15 +0000 (0:00:00.966) 0:00:05.498 ********* 2026-04-07 00:38:34.422902 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:38:34.422916 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:38:34.422928 | orchestrator | ok: [testbed-manager] 2026-04-07 00:38:34.422940 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:38:34.422953 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:38:34.422966 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:38:34.422977 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:38:34.422988 | orchestrator | 2026-04-07 00:38:34.422999 | orchestrator | TASK [osism.services.hddtemp : Load Kernel Module drivetemp] ******************* 2026-04-07 00:38:34.423010 | orchestrator | Tuesday 07 April 2026 00:38:16 +0000 (0:00:01.308) 0:00:06.807 ********* 2026-04-07 00:38:34.423020 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:38:34.423031 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:38:34.423042 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:38:34.423053 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:38:34.423063 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:38:34.423074 | orchestrator | changed: [testbed-manager] 2026-04-07 00:38:34.423085 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:38:34.423095 | orchestrator | 2026-04-07 00:38:34.423106 | orchestrator | TASK [osism.services.hddtemp : Install lm-sensors] ***************************** 2026-04-07 00:38:34.423117 | orchestrator | Tuesday 07 April 2026 00:38:17 +0000 (0:00:00.622) 0:00:07.429 ********* 2026-04-07 00:38:34.423141 | orchestrator | changed: [testbed-manager] 2026-04-07 00:38:34.423153 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:38:34.423164 | orchestrator | changed: [testbed-node-4] 
2026-04-07 00:38:34.423174 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:38:34.423185 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:38:34.423196 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:38:34.423206 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:38:34.423217 | orchestrator | 2026-04-07 00:38:34.423228 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific service tasks] **** 2026-04-07 00:38:34.423238 | orchestrator | Tuesday 07 April 2026 00:38:31 +0000 (0:00:13.712) 0:00:21.142 ********* 2026-04-07 00:38:34.423250 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/service-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:38:34.423261 | orchestrator | 2026-04-07 00:38:34.423272 | orchestrator | TASK [osism.services.hddtemp : Manage lm-sensors service] ********************** 2026-04-07 00:38:34.423283 | orchestrator | Tuesday 07 April 2026 00:38:32 +0000 (0:00:00.996) 0:00:22.139 ********* 2026-04-07 00:38:34.423294 | orchestrator | changed: [testbed-manager] 2026-04-07 00:38:34.423304 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:38:34.423315 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:38:34.423326 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:38:34.423337 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:38:34.423347 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:38:34.423358 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:38:34.423369 | orchestrator | 2026-04-07 00:38:34.423380 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:38:34.423391 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:38:34.423421 | orchestrator | testbed-node-0 : ok=8  
changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-07 00:38:34.423434 | orchestrator | testbed-node-1 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-07 00:38:34.423445 | orchestrator | testbed-node-2 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-07 00:38:34.423465 | orchestrator | testbed-node-3 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-07 00:38:34.423476 | orchestrator | testbed-node-4 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-07 00:38:34.423487 | orchestrator | testbed-node-5 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-07 00:38:34.423498 | orchestrator | 2026-04-07 00:38:34.423509 | orchestrator | 2026-04-07 00:38:34.423520 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:38:34.423531 | orchestrator | Tuesday 07 April 2026 00:38:34 +0000 (0:00:01.845) 0:00:23.984 ********* 2026-04-07 00:38:34.423542 | orchestrator | =============================================================================== 2026-04-07 00:38:34.423552 | orchestrator | osism.services.hddtemp : Install lm-sensors ---------------------------- 13.71s 2026-04-07 00:38:34.423563 | orchestrator | osism.services.hddtemp : Remove hddtemp package ------------------------- 2.50s 2026-04-07 00:38:34.423574 | orchestrator | osism.services.hddtemp : Manage lm-sensors service ---------------------- 1.85s 2026-04-07 00:38:34.423585 | orchestrator | osism.services.hddtemp : Check if drivetemp module is available --------- 1.31s 2026-04-07 00:38:34.423596 | orchestrator | osism.services.hddtemp : Include distribution specific install tasks ---- 1.10s 2026-04-07 00:38:34.423607 | orchestrator | osism.services.hddtemp : Include distribution specific service tasks ---- 1.00s 2026-04-07 00:38:34.423618 | orchestrator | osism.services.hddtemp : Enable 
Kernel Module drivetemp ----------------- 0.97s 2026-04-07 00:38:34.423654 | orchestrator | osism.services.hddtemp : Load Kernel Module drivetemp ------------------- 0.62s 2026-04-07 00:38:34.423666 | orchestrator | osism.services.hddtemp : Gather variables for each operating system ----- 0.59s 2026-04-07 00:38:34.598128 | orchestrator | ++ semver 10.0.0 7.1.1 2026-04-07 00:38:34.633800 | orchestrator | + [[ 1 -ge 0 ]] 2026-04-07 00:38:34.633889 | orchestrator | + sudo systemctl restart manager.service 2026-04-07 00:38:48.071785 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2026-04-07 00:38:48.071887 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2026-04-07 00:38:48.071902 | orchestrator | + local max_attempts=60 2026-04-07 00:38:48.071913 | orchestrator | + local name=ceph-ansible 2026-04-07 00:38:48.071923 | orchestrator | + local attempt_num=1 2026-04-07 00:38:48.071933 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:38:48.108662 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:38:48.108736 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:38:48.108745 | orchestrator | + sleep 5 2026-04-07 00:38:53.113076 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:38:53.137270 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:38:53.137367 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:38:53.137387 | orchestrator | + sleep 5 2026-04-07 00:38:58.140498 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:38:58.175243 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:38:58.175339 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:38:58.175353 | orchestrator | + sleep 5 2026-04-07 00:39:03.178626 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 
2026-04-07 00:39:03.208438 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:03.208520 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:03.208555 | orchestrator | + sleep 5 2026-04-07 00:39:08.213045 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:08.249893 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:08.250012 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:08.250078 | orchestrator | + sleep 5 2026-04-07 00:39:13.253774 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:13.292855 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:13.292924 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:13.292931 | orchestrator | + sleep 5 2026-04-07 00:39:18.297711 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:18.333500 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:18.333589 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:18.333624 | orchestrator | + sleep 5 2026-04-07 00:39:23.338558 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:23.369470 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:23.369571 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:23.369585 | orchestrator | + sleep 5 2026-04-07 00:39:28.372110 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:28.402584 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:28.402736 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:28.402764 | orchestrator | + sleep 5 2026-04-07 00:39:33.405506 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 
00:39:33.440558 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:33.440832 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:33.440853 | orchestrator | + sleep 5 2026-04-07 00:39:38.444816 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:38.480971 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:38.481091 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:38.481118 | orchestrator | + sleep 5 2026-04-07 00:39:43.486198 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:43.520698 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:43.520800 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:43.520817 | orchestrator | + sleep 5 2026-04-07 00:39:48.526351 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:48.562547 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:48.562709 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-07 00:39:48.562727 | orchestrator | + sleep 5 2026-04-07 00:39:53.567191 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-07 00:39:53.601911 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:53.602088 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2026-04-07 00:39:53.602111 | orchestrator | + local max_attempts=60 2026-04-07 00:39:53.602125 | orchestrator | + local name=kolla-ansible 2026-04-07 00:39:53.602136 | orchestrator | + local attempt_num=1 2026-04-07 00:39:53.602148 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2026-04-07 00:39:53.630089 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:53.630212 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2026-04-07 00:39:53.630237 | 
orchestrator | + local max_attempts=60 2026-04-07 00:39:53.630251 | orchestrator | + local name=osism-ansible 2026-04-07 00:39:53.630263 | orchestrator | + local attempt_num=1 2026-04-07 00:39:53.630274 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2026-04-07 00:39:53.655462 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-07 00:39:53.655565 | orchestrator | + [[ true == \t\r\u\e ]] 2026-04-07 00:39:53.655610 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2026-04-07 00:39:53.794441 | orchestrator | ARA in ceph-ansible already disabled. 2026-04-07 00:39:53.910090 | orchestrator | ARA in kolla-ansible already disabled. 2026-04-07 00:39:54.042530 | orchestrator | ARA in osism-ansible already disabled. 2026-04-07 00:39:54.176408 | orchestrator | ARA in osism-kubernetes already disabled. 2026-04-07 00:39:54.176877 | orchestrator | + osism apply gather-facts 2026-04-07 00:40:05.358927 | orchestrator | 2026-04-07 00:40:05 | INFO  | Prepare task for execution of gather-facts. 2026-04-07 00:40:05.427308 | orchestrator | 2026-04-07 00:40:05 | INFO  | Task 0483d0d3-b2e7-4ea5-8a44-f7d5ba9a4740 (gather-facts) was prepared for execution. 2026-04-07 00:40:05.427433 | orchestrator | 2026-04-07 00:40:05 | INFO  | It takes a moment until task 0483d0d3-b2e7-4ea5-8a44-f7d5ba9a4740 (gather-facts) has been started and output is visible here. 
2026-04-07 00:40:08.994313 | orchestrator | [WARNING]: Invalid characters were found in group names but not replaced, use
2026-04-07 00:40:08.994431 | orchestrator | -vvvv to see details
2026-04-07 00:40:08.994460 | orchestrator |
2026-04-07 00:40:08.994474 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2026-04-07 00:40:08.994516 | orchestrator |
2026-04-07 00:40:08.994529 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-04-07 00:40:08.994543 | orchestrator | fatal: [testbed-manager]: UNREACHABLE! => {"changed": false, "msg": "Data could not be sent to remote host \"192.168.16.5\". Make sure this host can be reached over ssh: no such identity: /ansible/secrets/id_rsa: No such file or directory\r\ndragon@192.168.16.5: Permission denied (publickey).\r\n", "unreachable": true}
2026-04-07 00:40:08.994556 | orchestrator | ...ignoring
2026-04-07 00:40:08.994606 | orchestrator | fatal: [testbed-node-2]: UNREACHABLE! => {"changed": false, "msg": "Data could not be sent to remote host \"192.168.16.12\". Make sure this host can be reached over ssh: no such identity: /ansible/secrets/id_rsa: No such file or directory\r\ndragon@192.168.16.12: Permission denied (publickey).\r\n", "unreachable": true}
2026-04-07 00:40:08.994618 | orchestrator | ...ignoring
2026-04-07 00:40:08.994629 | orchestrator | fatal: [testbed-node-1]: UNREACHABLE! => {"changed": false, "msg": "Data could not be sent to remote host \"192.168.16.11\". Make sure this host can be reached over ssh: no such identity: /ansible/secrets/id_rsa: No such file or directory\r\ndragon@192.168.16.11: Permission denied (publickey).\r\n", "unreachable": true}
2026-04-07 00:40:08.994640 | orchestrator | ...ignoring
2026-04-07 00:40:08.994667 | orchestrator | fatal: [testbed-node-0]: UNREACHABLE! => {"changed": false, "msg": "Data could not be sent to remote host \"192.168.16.10\". Make sure this host can be reached over ssh: no such identity: /ansible/secrets/id_rsa: No such file or directory\r\ndragon@192.168.16.10: Permission denied (publickey).\r\n", "unreachable": true}
2026-04-07 00:40:08.994678 | orchestrator | ...ignoring
2026-04-07 00:40:08.994695 | orchestrator | fatal: [testbed-node-5]: UNREACHABLE! => {"changed": false, "msg": "Data could not be sent to remote host \"192.168.16.15\". Make sure this host can be reached over ssh: no such identity: /ansible/secrets/id_rsa: No such file or directory\r\ndragon@192.168.16.15: Permission denied (publickey).\r\n", "unreachable": true}
2026-04-07 00:40:08.994715 | orchestrator | ...ignoring
2026-04-07 00:40:08.994735 | orchestrator | fatal: [testbed-node-4]: UNREACHABLE! => {"changed": false, "msg": "Data could not be sent to remote host \"192.168.16.14\". Make sure this host can be reached over ssh: no such identity: /ansible/secrets/id_rsa: No such file or directory\r\ndragon@192.168.16.14: Permission denied (publickey).\r\n", "unreachable": true}
2026-04-07 00:40:08.994755 | orchestrator | ...ignoring
2026-04-07 00:40:08.994775 | orchestrator | fatal: [testbed-node-3]: UNREACHABLE! => {"changed": false, "msg": "Data could not be sent to remote host \"192.168.16.13\". Make sure this host can be reached over ssh: no such identity: /ansible/secrets/id_rsa: No such file or directory\r\ndragon@192.168.16.13: Permission denied (publickey).\r\n", "unreachable": true}
2026-04-07 00:40:08.994794 | orchestrator | ...ignoring
2026-04-07 00:40:08.994805 | orchestrator |
2026-04-07 00:40:08.994816 | orchestrator | PLAY [Gather facts for all hosts if using --limit] *****************************
2026-04-07 00:40:08.994827 | orchestrator |
2026-04-07 00:40:08.994838 | orchestrator | TASK [Gather facts for all hosts] **********************************************
2026-04-07 00:40:08.994849 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:40:08.994865 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:40:08.994879 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:40:08.994891 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:40:08.994904 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:40:08.994918 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:40:08.994930 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:40:08.994942 | orchestrator |
2026-04-07 00:40:08.994956 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:40:08.994969 | orchestrator | testbed-manager : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-04-07 00:40:08.994993 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-04-07 00:40:08.995007 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-04-07 00:40:08.995021 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-04-07 00:40:08.995054 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-04-07 00:40:08.995065 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-04-07 00:40:08.995076 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-04-07 00:40:08.995088 | orchestrator |
2026-04-07 00:40:09.105729 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/001-helpers.sh /usr/local/bin/deploy-helper
2026-04-07 00:40:09.117150 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-ansible.sh /usr/local/bin/deploy-ceph-with-ansible
2026-04-07 00:40:09.126839 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-rook.sh /usr/local/bin/deploy-ceph-with-rook
2026-04-07 00:40:09.136445 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/200-infrastructure.sh /usr/local/bin/deploy-infrastructure
2026-04-07 00:40:09.147867 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/300-openstack.sh /usr/local/bin/deploy-openstack
2026-04-07 00:40:09.155211 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/320-openstack-minimal.sh /usr/local/bin/deploy-openstack-minimal
2026-04-07 00:40:09.164456 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/400-monitoring.sh /usr/local/bin/deploy-monitoring
2026-04-07 00:40:09.172367 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/500-kubernetes.sh /usr/local/bin/deploy-kubernetes
2026-04-07 00:40:09.181670 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/510-clusterapi.sh /usr/local/bin/deploy-kubernetes-clusterapi
2026-04-07 00:40:09.189818 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade-manager.sh /usr/local/bin/upgrade-manager
2026-04-07 00:40:09.198372 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-ansible.sh /usr/local/bin/upgrade-ceph-with-ansible
2026-04-07 00:40:09.208265 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-rook.sh /usr/local/bin/upgrade-ceph-with-rook
2026-04-07 00:40:09.219457 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/200-infrastructure.sh /usr/local/bin/upgrade-infrastructure
2026-04-07 00:40:09.228289 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/300-openstack.sh /usr/local/bin/upgrade-openstack
2026-04-07 00:40:09.239750 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/320-openstack-minimal.sh /usr/local/bin/upgrade-openstack-minimal
2026-04-07 00:40:09.249849 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/400-monitoring.sh /usr/local/bin/upgrade-monitoring
2026-04-07 00:40:09.263353 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/500-kubernetes.sh /usr/local/bin/upgrade-kubernetes
2026-04-07 00:40:09.273326 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/510-clusterapi.sh /usr/local/bin/upgrade-kubernetes-clusterapi
2026-04-07 00:40:09.288350 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/300-openstack.sh /usr/local/bin/bootstrap-openstack
2026-04-07 00:40:09.299477 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/301-openstack-octavia-amhpora-image.sh /usr/local/bin/bootstrap-octavia
2026-04-07 00:40:09.315779 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/302-openstack-k8s-clusterapi-images.sh /usr/local/bin/bootstrap-clusterapi
2026-04-07 00:40:09.329762 | orchestrator | + sudo ln -sf /opt/configuration/scripts/disable-local-registry.sh /usr/local/bin/disable-local-registry
2026-04-07 00:40:09.347365 | orchestrator | + sudo ln -sf /opt/configuration/scripts/pull-images.sh /usr/local/bin/pull-images
2026-04-07 00:40:09.366245 | orchestrator | + [[ false == \t\r\u\e ]]
2026-04-07 00:40:09.493946 | orchestrator | ok: Runtime: 0:23:41.251475
2026-04-07 00:40:09.603263 |
2026-04-07 00:40:09.603477 | TASK [Deploy services]
2026-04-07 00:40:10.136644 | orchestrator | skipping: Conditional result was False
2026-04-07 00:40:10.146949 |
2026-04-07 00:40:10.147090 | TASK [Deploy in a nutshell]
2026-04-07 00:40:10.836137 | orchestrator | + set -e
2026-04-07 00:40:10.836328 | orchestrator | + source /opt/configuration/scripts/include.sh
2026-04-07 00:40:10.836353 | orchestrator | ++ export INTERACTIVE=false
2026-04-07 00:40:10.836374 | orchestrator | ++ INTERACTIVE=false
2026-04-07 00:40:10.836389 | orchestrator | ++ export OSISM_APPLY_RETRY=1
2026-04-07 00:40:10.836401 | orchestrator | ++ OSISM_APPLY_RETRY=1
2026-04-07 00:40:10.836415 | orchestrator | + source /opt/manager-vars.sh
2026-04-07 00:40:10.836460 | orchestrator | ++ export NUMBER_OF_NODES=6
2026-04-07 00:40:10.836489 | orchestrator | ++ NUMBER_OF_NODES=6
2026-04-07 00:40:10.836504 | orchestrator | ++ export CEPH_VERSION=
2026-04-07 00:40:10.836519 | orchestrator | ++ CEPH_VERSION=
2026-04-07 00:40:10.836532 | orchestrator | ++ export CONFIGURATION_VERSION=main
2026-04-07 00:40:10.836558 | orchestrator | ++ CONFIGURATION_VERSION=main
2026-04-07 00:40:10.836621 | orchestrator | ++ export MANAGER_VERSION=10.0.0
2026-04-07 00:40:10.836652 | orchestrator | ++ MANAGER_VERSION=10.0.0
2026-04-07 00:40:10.836670 | orchestrator | ++ export OPENSTACK_VERSION=
2026-04-07 00:40:10.836688 | orchestrator | ++ OPENSTACK_VERSION=
2026-04-07 00:40:10.836713 | orchestrator | ++ export ARA=false
2026-04-07 00:40:10.836734 | orchestrator | ++ ARA=false
2026-04-07 00:40:10.836752 | orchestrator | ++ export DEPLOY_MODE=manager
2026-04-07 00:40:10.836777 | orchestrator | ++ DEPLOY_MODE=manager
2026-04-07 00:40:10.836796 | orchestrator | ++ export TEMPEST=true
2026-04-07 00:40:10.836814 | orchestrator | ++ TEMPEST=true
2026-04-07 00:40:10.836833 | orchestrator | ++ export IS_ZUUL=true
2026-04-07 00:40:10.836852 | orchestrator | ++ IS_ZUUL=true
2026-04-07 00:40:10.836872 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.120
2026-04-07 00:40:10.836892 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.120
2026-04-07 00:40:10.836911 | orchestrator | ++ export EXTERNAL_API=false
2026-04-07 00:40:10.836930 | orchestrator | ++ EXTERNAL_API=false
2026-04-07 00:40:10.836942 | orchestrator | ++ export IMAGE_USER=ubuntu
2026-04-07 00:40:10.836954 | orchestrator | ++ IMAGE_USER=ubuntu
2026-04-07 00:40:10.836965 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu
2026-04-07 00:40:10.836976 | orchestrator | ++ IMAGE_NODE_USER=ubuntu
2026-04-07 00:40:10.836987 | orchestrator | ++ export CEPH_STACK=ceph-ansible
2026-04-07 00:40:10.836998 | orchestrator | ++ CEPH_STACK=ceph-ansible
2026-04-07 00:40:10.837025 | orchestrator |
2026-04-07 00:40:10.837076 | orchestrator | # PULL IMAGES
2026-04-07 00:40:10.837090 | orchestrator |
2026-04-07 00:40:10.837101 | orchestrator | + echo
2026-04-07 00:40:10.837115 | orchestrator | + echo '# PULL IMAGES'
2026-04-07 00:40:10.837134 | orchestrator | + echo
2026-04-07 00:40:10.837159 | orchestrator | ++ semver 10.0.0 7.0.0
2026-04-07 00:40:10.885538 | orchestrator | + [[ 1 -ge 0 ]]
2026-04-07 00:40:10.885706 | orchestrator | + osism apply --no-wait -r 2 -e custom pull-images
2026-04-07 00:40:12.031401 | orchestrator | 2026-04-07 00:40:12 | INFO  | Trying to run play pull-images in environment custom
2026-04-07 00:40:22.058691 | orchestrator | 2026-04-07 00:40:22 | INFO  | Prepare task for execution of pull-images.
2026-04-07 00:40:22.127077 | orchestrator | 2026-04-07 00:40:22 | INFO  | Task 017a15f2-f69b-47e9-b244-4535ef28c0c6 (pull-images) was prepared for execution.
2026-04-07 00:40:22.127214 | orchestrator | 2026-04-07 00:40:22 | INFO  | Task 017a15f2-f69b-47e9-b244-4535ef28c0c6 is running in background. No more output. Check ARA for logs.
2026-04-07 00:40:23.423599 | orchestrator | 2026-04-07 00:40:23 | INFO  | Trying to run play wipe-partitions in environment custom
2026-04-07 00:40:33.472949 | orchestrator | 2026-04-07 00:40:33 | INFO  | Prepare task for execution of wipe-partitions.
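The deploy script above sets `OSISM_APPLY_RETRY=1` and invokes `osism apply --no-wait -r 2 -e custom pull-images`, where `-r 2` asks osism to retry the play on failure. osism handles the retries internally; as an illustration only, the same retry pattern can be sketched as a plain shell helper (the `retry` and `flaky` names are hypothetical, not part of osism):

```shell
#!/usr/bin/env sh
# Generic retry helper mirroring the "-r 2" idea: run a command and
# re-run it until it succeeds or the attempt budget is exhausted.
retry() {
    attempts=$1; shift
    i=1
    while true; do
        "$@" && return 0
        [ "$i" -ge "$attempts" ] && return 1
        echo "attempt $i of $attempts failed, retrying: $*" >&2
        i=$((i + 1))
    done
}

# Demo command (hypothetical): fails on the first call, succeeds once
# a marker file exists.
flaky() {
    if [ -e /tmp/retry-demo-marker ]; then
        echo "ok"
    else
        touch /tmp/retry-demo-marker
        return 1
    fi
}

rm -f /tmp/retry-demo-marker
retry 2 flaky   # first attempt fails, second succeeds and prints "ok"
```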
2026-04-07 00:40:33.561590 | orchestrator | 2026-04-07 00:40:33 | INFO  | Task b07176ee-5c74-4b92-9f8b-ff38c2f53512 (wipe-partitions) was prepared for execution.
2026-04-07 00:40:33.561681 | orchestrator | 2026-04-07 00:40:33 | INFO  | It takes a moment until task b07176ee-5c74-4b92-9f8b-ff38c2f53512 (wipe-partitions) has been started and output is visible here.
2026-04-07 00:40:44.943516 | orchestrator |
2026-04-07 00:40:44.943717 | orchestrator | PLAY [Wipe partitions] *********************************************************
2026-04-07 00:40:44.943730 | orchestrator |
2026-04-07 00:40:44.943737 | orchestrator | TASK [Find all logical devices owned by UID 167] *******************************
2026-04-07 00:40:44.943752 | orchestrator | Tuesday 07 April 2026 00:40:36 +0000 (0:00:00.145) 0:00:00.145 *********
2026-04-07 00:40:44.943760 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:40:44.943794 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:40:44.943801 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:40:44.943807 | orchestrator |
2026-04-07 00:40:44.943813 | orchestrator | TASK [Remove all rook related logical devices] *********************************
2026-04-07 00:40:44.943818 | orchestrator | Tuesday 07 April 2026 00:40:37 +0000 (0:00:01.146) 0:00:01.292 *********
2026-04-07 00:40:44.943824 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:40:44.943834 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:40:44.943840 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:40:44.943846 | orchestrator |
2026-04-07 00:40:44.943851 | orchestrator | TASK [Find all logical devices with prefix ceph] *******************************
2026-04-07 00:40:44.943857 | orchestrator | Tuesday 07 April 2026 00:40:37 +0000 (0:00:00.545) 0:00:01.516 *********
2026-04-07 00:40:44.943863 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:40:44.943870 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:40:44.943875 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:40:44.943881 | orchestrator |
2026-04-07 00:40:44.943887 | orchestrator | TASK [Remove all ceph related logical devices] *********************************
2026-04-07 00:40:44.943892 | orchestrator | Tuesday 07 April 2026 00:40:38 +0000 (0:00:00.545) 0:00:02.061 *********
2026-04-07 00:40:44.943898 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:40:44.943903 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:40:44.943909 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:40:44.943914 | orchestrator |
2026-04-07 00:40:44.943920 | orchestrator | TASK [Check device availability] ***********************************************
2026-04-07 00:40:44.943926 | orchestrator | Tuesday 07 April 2026 00:40:38 +0000 (0:00:00.217) 0:00:02.279 *********
2026-04-07 00:40:44.943931 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb)
2026-04-07 00:40:44.943939 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb)
2026-04-07 00:40:44.943945 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb)
2026-04-07 00:40:44.943950 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc)
2026-04-07 00:40:44.943956 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc)
2026-04-07 00:40:44.943961 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc)
2026-04-07 00:40:44.943966 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd)
2026-04-07 00:40:44.943972 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd)
2026-04-07 00:40:44.943977 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd)
2026-04-07 00:40:44.943983 | orchestrator |
2026-04-07 00:40:44.943989 | orchestrator | TASK [Wipe partitions with wipefs] *********************************************
2026-04-07 00:40:44.943995 | orchestrator | Tuesday 07 April 2026 00:40:40 +0000 (0:00:01.297) 0:00:03.577 *********
2026-04-07 00:40:44.944001 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdb)
2026-04-07 00:40:44.944008 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdb)
2026-04-07 00:40:44.944014 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdb)
2026-04-07 00:40:44.944021 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdc)
2026-04-07 00:40:44.944027 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdc)
2026-04-07 00:40:44.944033 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdc)
2026-04-07 00:40:44.944040 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdd)
2026-04-07 00:40:44.944046 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdd)
2026-04-07 00:40:44.944053 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdd)
2026-04-07 00:40:44.944059 | orchestrator |
2026-04-07 00:40:44.944066 | orchestrator | TASK [Overwrite first 32M with zeros] ******************************************
2026-04-07 00:40:44.944072 | orchestrator | Tuesday 07 April 2026 00:40:41 +0000 (0:00:01.387) 0:00:04.965 *********
2026-04-07 00:40:44.944078 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb)
2026-04-07 00:40:44.944085 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb)
2026-04-07 00:40:44.944091 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb)
2026-04-07 00:40:44.944098 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc)
2026-04-07 00:40:44.944104 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc)
2026-04-07 00:40:44.944120 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc)
2026-04-07 00:40:44.944127 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd)
2026-04-07 00:40:44.944133 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd)
2026-04-07 00:40:44.944140 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd)
2026-04-07 00:40:44.944146 | orchestrator |
2026-04-07 00:40:44.944153 | orchestrator | TASK [Reload udev rules] *******************************************************
2026-04-07 00:40:44.944159 | orchestrator | Tuesday 07 April 2026 00:40:43 +0000 (0:00:02.066) 0:00:07.032 *********
2026-04-07 00:40:44.944165 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:40:44.944172 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:40:44.944178 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:40:44.944184 | orchestrator |
2026-04-07 00:40:44.944191 | orchestrator | TASK [Request device events from the kernel] ***********************************
2026-04-07 00:40:44.944198 | orchestrator | Tuesday 07 April 2026 00:40:44 +0000 (0:00:00.547) 0:00:07.579 *********
2026-04-07 00:40:44.944204 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:40:44.944211 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:40:44.944217 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:40:44.944223 | orchestrator |
2026-04-07 00:40:44.944229 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:40:44.944236 | orchestrator | testbed-node-3 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:40:44.944244 | orchestrator | testbed-node-4 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:40:44.944267 | orchestrator | testbed-node-5 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:40:44.944273 | orchestrator |
2026-04-07 00:40:44.944279 | orchestrator |
2026-04-07 00:40:44.944285 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:40:44.944291 | orchestrator | Tuesday 07 April 2026 00:40:44 +0000 (0:00:00.598) 0:00:08.177 *********
2026-04-07 00:40:44.944296 | orchestrator | ===============================================================================
2026-04-07 00:40:44.944302 | orchestrator | Overwrite first 32M with zeros ------------------------------------------ 2.07s
2026-04-07 00:40:44.944307 | orchestrator | Wipe partitions with wipefs --------------------------------------------- 1.39s
2026-04-07 00:40:44.944313 | orchestrator | Check device availability ----------------------------------------------- 1.30s
2026-04-07 00:40:44.944318 | orchestrator | Find all logical devices owned by UID 167 ------------------------------- 1.15s
2026-04-07 00:40:44.944324 | orchestrator | Request device events from the kernel ----------------------------------- 0.60s
2026-04-07 00:40:44.944329 | orchestrator | Reload udev rules ------------------------------------------------------- 0.55s
2026-04-07 00:40:44.944335 | orchestrator | Find all logical devices with prefix ceph ------------------------------- 0.55s
2026-04-07 00:40:44.944340 | orchestrator | Remove all rook related logical devices --------------------------------- 0.22s
2026-04-07 00:40:44.944346 | orchestrator | Remove all ceph related logical devices --------------------------------- 0.22s
2026-04-07 00:40:56.439282 | orchestrator | 2026-04-07 00:40:56 | INFO  | Prepare task for execution of facts.
2026-04-07 00:40:56.494763 | orchestrator | 2026-04-07 00:40:56 | INFO  | Task e1d2abae-e2cf-42fb-bae3-60c83ac5eeb4 (facts) was prepared for execution.
2026-04-07 00:40:56.494887 | orchestrator | 2026-04-07 00:40:56 | INFO  | It takes a moment until task e1d2abae-e2cf-42fb-bae3-60c83ac5eeb4 (facts) has been started and output is visible here.
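Judging only by the task names in the wipe-partitions play above ("Wipe partitions with wipefs", "Overwrite first 32M with zeros", "Reload udev rules", "Request device events from the kernel"), the per-device sequence corresponds roughly to the commands below. The exact flags are assumptions, not taken from the playbook source, and the sketch prints the plan rather than executing it, since these commands are destructive:

```shell
#!/usr/bin/env sh
# Print (do not run) the per-device wipe sequence suggested by the
# task names: wipefs the signatures, zero the first 32M, then ask
# udev to re-read the devices. Flags are assumed, not verified.
plan_wipe() {
    for dev in "$@"; do
        echo "wipefs --all $dev"
        echo "dd if=/dev/zero of=$dev bs=1M count=32 oflag=direct"
    done
    echo "udevadm control --reload-rules"
    echo "udevadm trigger"
}

plan_wipe /dev/sdb /dev/sdc /dev/sdd
```

To actually apply such a plan one would pipe it to a shell on the target node, which is exactly what running the play via `osism apply wipe-partitions` automates across testbed-node-3 through testbed-node-5.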
2026-04-07 00:41:07.658320 | orchestrator |
2026-04-07 00:41:07.658423 | orchestrator | PLAY [Apply role facts] ********************************************************
2026-04-07 00:41:07.658431 | orchestrator |
2026-04-07 00:41:07.658436 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] *********************
2026-04-07 00:41:07.658463 | orchestrator | Tuesday 07 April 2026 00:40:59 +0000 (0:00:00.278) 0:00:00.278 *********
2026-04-07 00:41:07.658468 | orchestrator | ok: [testbed-manager]
2026-04-07 00:41:07.658474 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:41:07.658478 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:41:07.658482 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:41:07.658486 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:41:07.658490 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:41:07.658494 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:41:07.658497 | orchestrator |
2026-04-07 00:41:07.658502 | orchestrator | TASK [osism.commons.facts : Copy fact files] ***********************************
2026-04-07 00:41:07.658564 | orchestrator | Tuesday 07 April 2026 00:41:00 +0000 (0:00:01.267) 0:00:01.546 *********
2026-04-07 00:41:07.658570 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:41:07.658575 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:41:07.658579 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:41:07.658583 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:41:07.658587 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:07.658591 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:41:07.658595 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:41:07.658599 | orchestrator |
2026-04-07 00:41:07.658603 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2026-04-07 00:41:07.658606 | orchestrator |
2026-04-07 00:41:07.658610 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-04-07 00:41:07.658615 | orchestrator | Tuesday 07 April 2026 00:41:01 +0000 (0:00:01.001) 0:00:02.547 *********
2026-04-07 00:41:07.658619 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:41:07.658623 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:41:07.658626 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:41:07.658630 | orchestrator | ok: [testbed-manager]
2026-04-07 00:41:07.658634 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:41:07.658638 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:41:07.658642 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:41:07.658646 | orchestrator |
2026-04-07 00:41:07.658649 | orchestrator | PLAY [Gather facts for all hosts if using --limit] *****************************
2026-04-07 00:41:07.658653 | orchestrator |
2026-04-07 00:41:07.658657 | orchestrator | TASK [Gather facts for all hosts] **********************************************
2026-04-07 00:41:07.658661 | orchestrator | Tuesday 07 April 2026 00:41:07 +0000 (0:00:05.297) 0:00:07.845 *********
2026-04-07 00:41:07.658665 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:41:07.658669 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:41:07.658673 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:41:07.658676 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:41:07.658680 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:07.658684 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:41:07.658688 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:41:07.658692 | orchestrator |
2026-04-07 00:41:07.658695 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:41:07.658699 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:41:07.658705 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:41:07.658709 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:41:07.658712 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:41:07.658716 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:41:07.658720 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:41:07.658730 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-07 00:41:07.658734 | orchestrator |
2026-04-07 00:41:07.658738 | orchestrator |
2026-04-07 00:41:07.658742 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:41:07.658746 | orchestrator | Tuesday 07 April 2026 00:41:07 +0000 (0:00:00.444) 0:00:08.290 *********
2026-04-07 00:41:07.658750 | orchestrator | ===============================================================================
2026-04-07 00:41:07.658754 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.30s
2026-04-07 00:41:07.658758 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.27s
2026-04-07 00:41:07.658761 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.00s
2026-04-07 00:41:07.658765 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.44s
2026-04-07 00:41:08.904828 | orchestrator | 2026-04-07 00:41:08 | INFO  | Prepare task for execution of ceph-configure-lvm-volumes.
2026-04-07 00:41:08.960228 | orchestrator | 2026-04-07 00:41:08 | INFO  | Task acf1e0ee-7903-4688-9b30-ff27689852b2 (ceph-configure-lvm-volumes) was prepared for execution.
2026-04-07 00:41:08.960305 | orchestrator | 2026-04-07 00:41:08 | INFO  | It takes a moment until task acf1e0ee-7903-4688-9b30-ff27689852b2 (ceph-configure-lvm-volumes) has been started and output is visible here.
2026-04-07 00:41:19.466320 | orchestrator | [WARNING]: Collection community.general does not support Ansible version
2026-04-07 00:41:19.466395 | orchestrator | 2.16.14
2026-04-07 00:41:19.466402 | orchestrator |
2026-04-07 00:41:19.466407 | orchestrator | PLAY [Ceph configure LVM] ******************************************************
2026-04-07 00:41:19.466412 | orchestrator |
2026-04-07 00:41:19.466417 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-04-07 00:41:19.466428 | orchestrator | Tuesday 07 April 2026 00:41:12 +0000 (0:00:00.254) 0:00:00.254 *********
2026-04-07 00:41:19.466433 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)]
2026-04-07 00:41:19.466437 | orchestrator |
2026-04-07 00:41:19.466441 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-04-07 00:41:19.466446 | orchestrator | Tuesday 07 April 2026 00:41:13 +0000 (0:00:00.222) 0:00:00.477 *********
2026-04-07 00:41:19.466450 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:41:19.466454 | orchestrator |
2026-04-07 00:41:19.466458 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466462 | orchestrator | Tuesday 07 April 2026 00:41:13 +0000 (0:00:00.201) 0:00:00.678 *********
2026-04-07 00:41:19.466466 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0)
2026-04-07 00:41:19.466471 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1)
2026-04-07 00:41:19.466474 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2)
2026-04-07 00:41:19.466478 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3)
2026-04-07 00:41:19.466482 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4)
2026-04-07 00:41:19.466486 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5)
2026-04-07 00:41:19.466490 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6)
2026-04-07 00:41:19.466493 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7)
2026-04-07 00:41:19.466498 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda)
2026-04-07 00:41:19.466502 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb)
2026-04-07 00:41:19.466505 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc)
2026-04-07 00:41:19.466546 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd)
2026-04-07 00:41:19.466551 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0)
2026-04-07 00:41:19.466554 | orchestrator |
2026-04-07 00:41:19.466558 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466562 | orchestrator | Tuesday 07 April 2026 00:41:13 +0000 (0:00:00.297) 0:00:00.976 *********
2026-04-07 00:41:19.466566 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466570 | orchestrator |
2026-04-07 00:41:19.466574 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466578 | orchestrator | Tuesday 07 April 2026 00:41:14 +0000 (0:00:00.373) 0:00:01.350 *********
2026-04-07 00:41:19.466582 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466586 | orchestrator |
2026-04-07 00:41:19.466589 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466593 | orchestrator | Tuesday 07 April 2026 00:41:14 +0000 (0:00:00.160) 0:00:01.510 *********
2026-04-07 00:41:19.466601 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466605 | orchestrator |
2026-04-07 00:41:19.466609 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466612 | orchestrator | Tuesday 07 April 2026 00:41:14 +0000 (0:00:00.192) 0:00:01.703 *********
2026-04-07 00:41:19.466616 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466621 | orchestrator |
2026-04-07 00:41:19.466625 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466628 | orchestrator | Tuesday 07 April 2026 00:41:14 +0000 (0:00:00.169) 0:00:01.873 *********
2026-04-07 00:41:19.466632 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466636 | orchestrator |
2026-04-07 00:41:19.466640 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466644 | orchestrator | Tuesday 07 April 2026 00:41:14 +0000 (0:00:00.164) 0:00:02.037 *********
2026-04-07 00:41:19.466648 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466652 | orchestrator |
2026-04-07 00:41:19.466656 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466660 | orchestrator | Tuesday 07 April 2026 00:41:14 +0000 (0:00:00.175) 0:00:02.213 *********
2026-04-07 00:41:19.466663 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466667 | orchestrator |
2026-04-07 00:41:19.466671 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466675 | orchestrator | Tuesday 07 April 2026 00:41:15 +0000 (0:00:00.170) 0:00:02.383 *********
2026-04-07 00:41:19.466679 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466683 | orchestrator |
2026-04-07 00:41:19.466687 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466691 | orchestrator | Tuesday 07 April 2026 00:41:15 +0000 (0:00:00.175) 0:00:02.558 *********
2026-04-07 00:41:19.466695 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967)
2026-04-07 00:41:19.466700 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967)
2026-04-07 00:41:19.466704 | orchestrator |
2026-04-07 00:41:19.466708 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466721 | orchestrator | Tuesday 07 April 2026 00:41:15 +0000 (0:00:00.351) 0:00:02.909 *********
2026-04-07 00:41:19.466726 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76)
2026-04-07 00:41:19.466730 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76)
2026-04-07 00:41:19.466733 | orchestrator |
2026-04-07 00:41:19.466737 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466741 | orchestrator | Tuesday 07 April 2026 00:41:15 +0000 (0:00:00.357) 0:00:03.266 *********
2026-04-07 00:41:19.466750 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88)
2026-04-07 00:41:19.466754 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88)
2026-04-07 00:41:19.466758 | orchestrator |
2026-04-07 00:41:19.466762 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466765 | orchestrator | Tuesday 07 April 2026 00:41:16 +0000 (0:00:00.500) 0:00:03.767 *********
2026-04-07 00:41:19.466769 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d)
2026-04-07 00:41:19.466773 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d)
2026-04-07 00:41:19.466777 | orchestrator |
2026-04-07 00:41:19.466781 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:41:19.466785 | orchestrator | Tuesday 07 April 2026 00:41:17 +0000 (0:00:00.537) 0:00:04.304 *********
2026-04-07 00:41:19.466789 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001)
2026-04-07 00:41:19.466792 | orchestrator |
2026-04-07 00:41:19.466796 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:41:19.466800 | orchestrator | Tuesday 07 April 2026 00:41:17 +0000 (0:00:00.722) 0:00:05.027 *********
2026-04-07 00:41:19.466804 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0)
2026-04-07 00:41:19.466811 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1)
2026-04-07 00:41:19.466815 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2)
2026-04-07 00:41:19.466819 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3)
2026-04-07 00:41:19.466823 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4)
2026-04-07 00:41:19.466827 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5)
2026-04-07 00:41:19.466831 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6)
2026-04-07 00:41:19.466834 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7)
2026-04-07 00:41:19.466838 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda)
2026-04-07 00:41:19.466842 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb)
2026-04-07 00:41:19.466846 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc)
2026-04-07 00:41:19.466850 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd)
2026-04-07 00:41:19.466853 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0)
2026-04-07 00:41:19.466857 | orchestrator |
2026-04-07 00:41:19.466861 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:41:19.466865 | orchestrator | Tuesday 07 April 2026 00:41:18 +0000 (0:00:00.356) 0:00:05.383 *********
2026-04-07 00:41:19.466869 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466873 | orchestrator |
2026-04-07 00:41:19.466877 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:41:19.466881 | orchestrator | Tuesday 07 April 2026 00:41:18 +0000 (0:00:00.194) 0:00:05.578 *********
2026-04-07 00:41:19.466885 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466889 | orchestrator |
2026-04-07 00:41:19.466894 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:41:19.466899 | orchestrator | Tuesday 07 April 2026 00:41:18 +0000 (0:00:00.194) 0:00:05.772 *********
2026-04-07 00:41:19.466903 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:41:19.466907 | orchestrator |
2026-04-07 00:41:19.466912 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:41:19.466920 |
orchestrator | Tuesday 07 April 2026 00:41:18 +0000 (0:00:00.196) 0:00:05.969 ********* 2026-04-07 00:41:19.466924 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:19.466929 | orchestrator | 2026-04-07 00:41:19.466933 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:19.466937 | orchestrator | Tuesday 07 April 2026 00:41:18 +0000 (0:00:00.199) 0:00:06.168 ********* 2026-04-07 00:41:19.466942 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:19.466946 | orchestrator | 2026-04-07 00:41:19.466951 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:19.466958 | orchestrator | Tuesday 07 April 2026 00:41:19 +0000 (0:00:00.189) 0:00:06.358 ********* 2026-04-07 00:41:19.466963 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:19.466967 | orchestrator | 2026-04-07 00:41:19.466972 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:19.466976 | orchestrator | Tuesday 07 April 2026 00:41:19 +0000 (0:00:00.186) 0:00:06.544 ********* 2026-04-07 00:41:19.466981 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:19.466985 | orchestrator | 2026-04-07 00:41:19.466992 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:26.265797 | orchestrator | Tuesday 07 April 2026 00:41:19 +0000 (0:00:00.189) 0:00:06.734 ********* 2026-04-07 00:41:26.265909 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.265926 | orchestrator | 2026-04-07 00:41:26.265939 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:26.265952 | orchestrator | Tuesday 07 April 2026 00:41:19 +0000 (0:00:00.226) 0:00:06.960 ********* 2026-04-07 00:41:26.265964 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2026-04-07 00:41:26.265977 | orchestrator | 
ok: [testbed-node-3] => (item=sda14) 2026-04-07 00:41:26.265988 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2026-04-07 00:41:26.266000 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2026-04-07 00:41:26.266011 | orchestrator | 2026-04-07 00:41:26.266074 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:26.266087 | orchestrator | Tuesday 07 April 2026 00:41:20 +0000 (0:00:01.014) 0:00:07.975 ********* 2026-04-07 00:41:26.266100 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266112 | orchestrator | 2026-04-07 00:41:26.266125 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:26.266138 | orchestrator | Tuesday 07 April 2026 00:41:20 +0000 (0:00:00.162) 0:00:08.138 ********* 2026-04-07 00:41:26.266150 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266162 | orchestrator | 2026-04-07 00:41:26.266174 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:26.266187 | orchestrator | Tuesday 07 April 2026 00:41:21 +0000 (0:00:00.190) 0:00:08.328 ********* 2026-04-07 00:41:26.266199 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266211 | orchestrator | 2026-04-07 00:41:26.266223 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:26.266235 | orchestrator | Tuesday 07 April 2026 00:41:21 +0000 (0:00:00.189) 0:00:08.518 ********* 2026-04-07 00:41:26.266248 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266259 | orchestrator | 2026-04-07 00:41:26.266272 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2026-04-07 00:41:26.266284 | orchestrator | Tuesday 07 April 2026 00:41:21 +0000 (0:00:00.182) 0:00:08.700 ********* 2026-04-07 00:41:26.266296 | orchestrator | ok: [testbed-node-3] => (item={'key': 
'sdb', 'value': None}) 2026-04-07 00:41:26.266309 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': None}) 2026-04-07 00:41:26.266321 | orchestrator | 2026-04-07 00:41:26.266334 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2026-04-07 00:41:26.266346 | orchestrator | Tuesday 07 April 2026 00:41:21 +0000 (0:00:00.143) 0:00:08.843 ********* 2026-04-07 00:41:26.266359 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266395 | orchestrator | 2026-04-07 00:41:26.266422 | orchestrator | TASK [Generate DB VG names] **************************************************** 2026-04-07 00:41:26.266449 | orchestrator | Tuesday 07 April 2026 00:41:21 +0000 (0:00:00.125) 0:00:08.968 ********* 2026-04-07 00:41:26.266474 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266496 | orchestrator | 2026-04-07 00:41:26.266536 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2026-04-07 00:41:26.266551 | orchestrator | Tuesday 07 April 2026 00:41:21 +0000 (0:00:00.120) 0:00:09.089 ********* 2026-04-07 00:41:26.266563 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266574 | orchestrator | 2026-04-07 00:41:26.266585 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2026-04-07 00:41:26.266596 | orchestrator | Tuesday 07 April 2026 00:41:21 +0000 (0:00:00.126) 0:00:09.215 ********* 2026-04-07 00:41:26.266615 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:41:26.266636 | orchestrator | 2026-04-07 00:41:26.266655 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2026-04-07 00:41:26.266666 | orchestrator | Tuesday 07 April 2026 00:41:22 +0000 (0:00:00.127) 0:00:09.343 ********* 2026-04-07 00:41:26.266684 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e0113da9-ca02-59fe-bdca-d5482abf5fe2'}}) 
2026-04-07 00:41:26.266705 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9eeb51fd-cca7-5129-bb0c-15bc93c67722'}}) 2026-04-07 00:41:26.266720 | orchestrator | 2026-04-07 00:41:26.266736 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2026-04-07 00:41:26.266746 | orchestrator | Tuesday 07 April 2026 00:41:22 +0000 (0:00:00.167) 0:00:09.511 ********* 2026-04-07 00:41:26.266758 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e0113da9-ca02-59fe-bdca-d5482abf5fe2'}})  2026-04-07 00:41:26.266783 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9eeb51fd-cca7-5129-bb0c-15bc93c67722'}})  2026-04-07 00:41:26.266795 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266805 | orchestrator | 2026-04-07 00:41:26.266817 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2026-04-07 00:41:26.266828 | orchestrator | Tuesday 07 April 2026 00:41:22 +0000 (0:00:00.137) 0:00:09.648 ********* 2026-04-07 00:41:26.266842 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e0113da9-ca02-59fe-bdca-d5482abf5fe2'}})  2026-04-07 00:41:26.266853 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9eeb51fd-cca7-5129-bb0c-15bc93c67722'}})  2026-04-07 00:41:26.266864 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266878 | orchestrator | 2026-04-07 00:41:26.266891 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2026-04-07 00:41:26.266905 | orchestrator | Tuesday 07 April 2026 00:41:22 +0000 (0:00:00.158) 0:00:09.806 ********* 2026-04-07 00:41:26.266920 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e0113da9-ca02-59fe-bdca-d5482abf5fe2'}})  2026-04-07 
00:41:26.266953 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9eeb51fd-cca7-5129-bb0c-15bc93c67722'}})  2026-04-07 00:41:26.266967 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.266978 | orchestrator | 2026-04-07 00:41:26.266989 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2026-04-07 00:41:26.267001 | orchestrator | Tuesday 07 April 2026 00:41:22 +0000 (0:00:00.308) 0:00:10.115 ********* 2026-04-07 00:41:26.267012 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:41:26.267023 | orchestrator | 2026-04-07 00:41:26.267034 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2026-04-07 00:41:26.267046 | orchestrator | Tuesday 07 April 2026 00:41:22 +0000 (0:00:00.115) 0:00:10.231 ********* 2026-04-07 00:41:26.267057 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:41:26.267068 | orchestrator | 2026-04-07 00:41:26.267092 | orchestrator | TASK [Set DB devices config data] ********************************************** 2026-04-07 00:41:26.267103 | orchestrator | Tuesday 07 April 2026 00:41:23 +0000 (0:00:00.119) 0:00:10.350 ********* 2026-04-07 00:41:26.267113 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.267123 | orchestrator | 2026-04-07 00:41:26.267136 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2026-04-07 00:41:26.267152 | orchestrator | Tuesday 07 April 2026 00:41:23 +0000 (0:00:00.110) 0:00:10.461 ********* 2026-04-07 00:41:26.267159 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.267165 | orchestrator | 2026-04-07 00:41:26.267172 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2026-04-07 00:41:26.267178 | orchestrator | Tuesday 07 April 2026 00:41:23 +0000 (0:00:00.123) 0:00:10.584 ********* 2026-04-07 00:41:26.267184 | orchestrator | skipping: [testbed-node-3] 
2026-04-07 00:41:26.267190 | orchestrator | 2026-04-07 00:41:26.267197 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2026-04-07 00:41:26.267203 | orchestrator | Tuesday 07 April 2026 00:41:23 +0000 (0:00:00.119) 0:00:10.704 ********* 2026-04-07 00:41:26.267209 | orchestrator | ok: [testbed-node-3] => { 2026-04-07 00:41:26.267216 | orchestrator |  "ceph_osd_devices": { 2026-04-07 00:41:26.267222 | orchestrator |  "sdb": { 2026-04-07 00:41:26.267229 | orchestrator |  "osd_lvm_uuid": "e0113da9-ca02-59fe-bdca-d5482abf5fe2" 2026-04-07 00:41:26.267235 | orchestrator |  }, 2026-04-07 00:41:26.267241 | orchestrator |  "sdc": { 2026-04-07 00:41:26.267248 | orchestrator |  "osd_lvm_uuid": "9eeb51fd-cca7-5129-bb0c-15bc93c67722" 2026-04-07 00:41:26.267254 | orchestrator |  } 2026-04-07 00:41:26.267260 | orchestrator |  } 2026-04-07 00:41:26.267266 | orchestrator | } 2026-04-07 00:41:26.267273 | orchestrator | 2026-04-07 00:41:26.267279 | orchestrator | TASK [Print WAL devices] ******************************************************* 2026-04-07 00:41:26.267285 | orchestrator | Tuesday 07 April 2026 00:41:23 +0000 (0:00:00.121) 0:00:10.825 ********* 2026-04-07 00:41:26.267292 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.267298 | orchestrator | 2026-04-07 00:41:26.267304 | orchestrator | TASK [Print DB devices] ******************************************************** 2026-04-07 00:41:26.267310 | orchestrator | Tuesday 07 April 2026 00:41:23 +0000 (0:00:00.120) 0:00:10.946 ********* 2026-04-07 00:41:26.267316 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:41:26.267323 | orchestrator | 2026-04-07 00:41:26.267329 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2026-04-07 00:41:26.267335 | orchestrator | Tuesday 07 April 2026 00:41:23 +0000 (0:00:00.096) 0:00:11.042 ********* 2026-04-07 00:41:26.267341 | orchestrator | skipping: [testbed-node-3] 
2026-04-07 00:41:26.267347 | orchestrator | 2026-04-07 00:41:26.267354 | orchestrator | TASK [Print configuration data] ************************************************ 2026-04-07 00:41:26.267360 | orchestrator | Tuesday 07 April 2026 00:41:23 +0000 (0:00:00.117) 0:00:11.160 ********* 2026-04-07 00:41:26.267366 | orchestrator | changed: [testbed-node-3] => { 2026-04-07 00:41:26.267372 | orchestrator |  "_ceph_configure_lvm_config_data": { 2026-04-07 00:41:26.267379 | orchestrator |  "ceph_osd_devices": { 2026-04-07 00:41:26.267385 | orchestrator |  "sdb": { 2026-04-07 00:41:26.267392 | orchestrator |  "osd_lvm_uuid": "e0113da9-ca02-59fe-bdca-d5482abf5fe2" 2026-04-07 00:41:26.267398 | orchestrator |  }, 2026-04-07 00:41:26.267404 | orchestrator |  "sdc": { 2026-04-07 00:41:26.267411 | orchestrator |  "osd_lvm_uuid": "9eeb51fd-cca7-5129-bb0c-15bc93c67722" 2026-04-07 00:41:26.267417 | orchestrator |  } 2026-04-07 00:41:26.267423 | orchestrator |  }, 2026-04-07 00:41:26.267429 | orchestrator |  "lvm_volumes": [ 2026-04-07 00:41:26.267436 | orchestrator |  { 2026-04-07 00:41:26.267442 | orchestrator |  "data": "osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2", 2026-04-07 00:41:26.267449 | orchestrator |  "data_vg": "ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2" 2026-04-07 00:41:26.267455 | orchestrator |  }, 2026-04-07 00:41:26.267466 | orchestrator |  { 2026-04-07 00:41:26.267472 | orchestrator |  "data": "osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722", 2026-04-07 00:41:26.267478 | orchestrator |  "data_vg": "ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722" 2026-04-07 00:41:26.267485 | orchestrator |  } 2026-04-07 00:41:26.267491 | orchestrator |  ] 2026-04-07 00:41:26.267497 | orchestrator |  } 2026-04-07 00:41:26.267504 | orchestrator | } 2026-04-07 00:41:26.267532 | orchestrator | 2026-04-07 00:41:26.267540 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2026-04-07 00:41:26.267546 | orchestrator | Tuesday 07 April 2026 
00:41:24 +0000 (0:00:00.175) 0:00:11.335 ********* 2026-04-07 00:41:26.267552 | orchestrator | changed: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2026-04-07 00:41:26.267558 | orchestrator | 2026-04-07 00:41:26.267565 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2026-04-07 00:41:26.267571 | orchestrator | 2026-04-07 00:41:26.267577 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2026-04-07 00:41:26.267583 | orchestrator | Tuesday 07 April 2026 00:41:25 +0000 (0:00:01.787) 0:00:13.123 ********* 2026-04-07 00:41:26.267590 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2026-04-07 00:41:26.267596 | orchestrator | 2026-04-07 00:41:26.267603 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-04-07 00:41:26.267613 | orchestrator | Tuesday 07 April 2026 00:41:26 +0000 (0:00:00.213) 0:00:13.337 ********* 2026-04-07 00:41:26.267620 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:41:26.267626 | orchestrator | 2026-04-07 00:41:26.267639 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.633846 | orchestrator | Tuesday 07 April 2026 00:41:26 +0000 (0:00:00.198) 0:00:13.535 ********* 2026-04-07 00:41:32.633959 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2026-04-07 00:41:32.633986 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2026-04-07 00:41:32.634006 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2026-04-07 00:41:32.634099 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2026-04-07 00:41:32.634124 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2026-04-07 
00:41:32.634144 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2026-04-07 00:41:32.634206 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2026-04-07 00:41:32.634228 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2026-04-07 00:41:32.634253 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2026-04-07 00:41:32.634273 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2026-04-07 00:41:32.634293 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2026-04-07 00:41:32.634312 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 2026-04-07 00:41:32.634331 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2026-04-07 00:41:32.634350 | orchestrator | 2026-04-07 00:41:32.634371 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.634391 | orchestrator | Tuesday 07 April 2026 00:41:26 +0000 (0:00:00.321) 0:00:13.857 ********* 2026-04-07 00:41:32.634413 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.634434 | orchestrator | 2026-04-07 00:41:32.634454 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.634470 | orchestrator | Tuesday 07 April 2026 00:41:26 +0000 (0:00:00.171) 0:00:14.028 ********* 2026-04-07 00:41:32.634482 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.634737 | orchestrator | 2026-04-07 00:41:32.634840 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.634868 | orchestrator | Tuesday 07 April 2026 00:41:26 +0000 (0:00:00.167) 0:00:14.196 ********* 2026-04-07 
00:41:32.634887 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.634907 | orchestrator | 2026-04-07 00:41:32.634996 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635015 | orchestrator | Tuesday 07 April 2026 00:41:27 +0000 (0:00:00.167) 0:00:14.363 ********* 2026-04-07 00:41:32.635034 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.635054 | orchestrator | 2026-04-07 00:41:32.635073 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635093 | orchestrator | Tuesday 07 April 2026 00:41:27 +0000 (0:00:00.170) 0:00:14.534 ********* 2026-04-07 00:41:32.635111 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.635126 | orchestrator | 2026-04-07 00:41:32.635138 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635149 | orchestrator | Tuesday 07 April 2026 00:41:27 +0000 (0:00:00.168) 0:00:14.703 ********* 2026-04-07 00:41:32.635160 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.635170 | orchestrator | 2026-04-07 00:41:32.635185 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635204 | orchestrator | Tuesday 07 April 2026 00:41:27 +0000 (0:00:00.400) 0:00:15.103 ********* 2026-04-07 00:41:32.635224 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.635243 | orchestrator | 2026-04-07 00:41:32.635262 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635280 | orchestrator | Tuesday 07 April 2026 00:41:28 +0000 (0:00:00.171) 0:00:15.275 ********* 2026-04-07 00:41:32.635296 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.635307 | orchestrator | 2026-04-07 00:41:32.635318 | orchestrator | TASK [Add known links to the list of available block devices] 
****************** 2026-04-07 00:41:32.635329 | orchestrator | Tuesday 07 April 2026 00:41:28 +0000 (0:00:00.173) 0:00:15.449 ********* 2026-04-07 00:41:32.635386 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82) 2026-04-07 00:41:32.635408 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82) 2026-04-07 00:41:32.635426 | orchestrator | 2026-04-07 00:41:32.635445 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635482 | orchestrator | Tuesday 07 April 2026 00:41:28 +0000 (0:00:00.351) 0:00:15.801 ********* 2026-04-07 00:41:32.635495 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba) 2026-04-07 00:41:32.635525 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba) 2026-04-07 00:41:32.635536 | orchestrator | 2026-04-07 00:41:32.635547 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635559 | orchestrator | Tuesday 07 April 2026 00:41:28 +0000 (0:00:00.362) 0:00:16.163 ********* 2026-04-07 00:41:32.635570 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6) 2026-04-07 00:41:32.635581 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6) 2026-04-07 00:41:32.635592 | orchestrator | 2026-04-07 00:41:32.635603 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635635 | orchestrator | Tuesday 07 April 2026 00:41:29 +0000 (0:00:00.362) 0:00:16.525 ********* 2026-04-07 00:41:32.635647 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696) 2026-04-07 00:41:32.635658 | orchestrator | 
ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696) 2026-04-07 00:41:32.635669 | orchestrator | 2026-04-07 00:41:32.635680 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:32.635756 | orchestrator | Tuesday 07 April 2026 00:41:29 +0000 (0:00:00.377) 0:00:16.902 ********* 2026-04-07 00:41:32.635773 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-04-07 00:41:32.635792 | orchestrator | 2026-04-07 00:41:32.635811 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.635829 | orchestrator | Tuesday 07 April 2026 00:41:29 +0000 (0:00:00.295) 0:00:17.198 ********* 2026-04-07 00:41:32.635840 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2026-04-07 00:41:32.635851 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2026-04-07 00:41:32.635862 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2026-04-07 00:41:32.635873 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2026-04-07 00:41:32.635884 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2026-04-07 00:41:32.635895 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5) 2026-04-07 00:41:32.635905 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2026-04-07 00:41:32.635916 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2026-04-07 00:41:32.635927 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2026-04-07 00:41:32.635937 | orchestrator | included: 
/ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2026-04-07 00:41:32.635948 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2026-04-07 00:41:32.635959 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2026-04-07 00:41:32.635970 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2026-04-07 00:41:32.635980 | orchestrator | 2026-04-07 00:41:32.636000 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636020 | orchestrator | Tuesday 07 April 2026 00:41:30 +0000 (0:00:00.335) 0:00:17.533 ********* 2026-04-07 00:41:32.636034 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.636052 | orchestrator | 2026-04-07 00:41:32.636071 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636089 | orchestrator | Tuesday 07 April 2026 00:41:30 +0000 (0:00:00.180) 0:00:17.714 ********* 2026-04-07 00:41:32.636105 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.636116 | orchestrator | 2026-04-07 00:41:32.636127 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636138 | orchestrator | Tuesday 07 April 2026 00:41:30 +0000 (0:00:00.456) 0:00:18.171 ********* 2026-04-07 00:41:32.636149 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.636160 | orchestrator | 2026-04-07 00:41:32.636170 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636228 | orchestrator | Tuesday 07 April 2026 00:41:31 +0000 (0:00:00.183) 0:00:18.354 ********* 2026-04-07 00:41:32.636246 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.636260 | orchestrator | 2026-04-07 00:41:32.636278 | orchestrator | TASK [Add known 
partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636297 | orchestrator | Tuesday 07 April 2026 00:41:31 +0000 (0:00:00.171) 0:00:18.526 ********* 2026-04-07 00:41:32.636315 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.636326 | orchestrator | 2026-04-07 00:41:32.636337 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636348 | orchestrator | Tuesday 07 April 2026 00:41:31 +0000 (0:00:00.176) 0:00:18.702 ********* 2026-04-07 00:41:32.636359 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.636370 | orchestrator | 2026-04-07 00:41:32.636381 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636407 | orchestrator | Tuesday 07 April 2026 00:41:31 +0000 (0:00:00.182) 0:00:18.885 ********* 2026-04-07 00:41:32.636419 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.636430 | orchestrator | 2026-04-07 00:41:32.636441 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636452 | orchestrator | Tuesday 07 April 2026 00:41:31 +0000 (0:00:00.172) 0:00:19.057 ********* 2026-04-07 00:41:32.636463 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:32.636473 | orchestrator | 2026-04-07 00:41:32.636484 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636495 | orchestrator | Tuesday 07 April 2026 00:41:31 +0000 (0:00:00.179) 0:00:19.237 ********* 2026-04-07 00:41:32.636522 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2026-04-07 00:41:32.636539 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2026-04-07 00:41:32.636559 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2026-04-07 00:41:32.636588 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2026-04-07 00:41:32.636608 | orchestrator | 2026-04-07 
00:41:32.636627 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:32.636645 | orchestrator | Tuesday 07 April 2026 00:41:32 +0000 (0:00:00.556) 0:00:19.793 ********* 2026-04-07 00:41:32.636663 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.741093 | orchestrator | 2026-04-07 00:41:38.741249 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:38.741276 | orchestrator | Tuesday 07 April 2026 00:41:32 +0000 (0:00:00.188) 0:00:19.981 ********* 2026-04-07 00:41:38.741295 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.741314 | orchestrator | 2026-04-07 00:41:38.741331 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:38.741347 | orchestrator | Tuesday 07 April 2026 00:41:32 +0000 (0:00:00.184) 0:00:20.166 ********* 2026-04-07 00:41:38.741364 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.741381 | orchestrator | 2026-04-07 00:41:38.741397 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:38.741414 | orchestrator | Tuesday 07 April 2026 00:41:33 +0000 (0:00:00.199) 0:00:20.365 ********* 2026-04-07 00:41:38.741431 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.741447 | orchestrator | 2026-04-07 00:41:38.741464 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2026-04-07 00:41:38.741481 | orchestrator | Tuesday 07 April 2026 00:41:33 +0000 (0:00:00.200) 0:00:20.566 ********* 2026-04-07 00:41:38.741498 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': None}) 2026-04-07 00:41:38.741580 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': None}) 2026-04-07 00:41:38.741598 | orchestrator | 2026-04-07 00:41:38.741616 | orchestrator | TASK [Generate WAL VG names] 
*************************************************** 2026-04-07 00:41:38.741633 | orchestrator | Tuesday 07 April 2026 00:41:33 +0000 (0:00:00.338) 0:00:20.905 ********* 2026-04-07 00:41:38.741650 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.741667 | orchestrator | 2026-04-07 00:41:38.741685 | orchestrator | TASK [Generate DB VG names] **************************************************** 2026-04-07 00:41:38.741702 | orchestrator | Tuesday 07 April 2026 00:41:33 +0000 (0:00:00.134) 0:00:21.040 ********* 2026-04-07 00:41:38.741726 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.741746 | orchestrator | 2026-04-07 00:41:38.741765 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2026-04-07 00:41:38.741784 | orchestrator | Tuesday 07 April 2026 00:41:33 +0000 (0:00:00.125) 0:00:21.165 ********* 2026-04-07 00:41:38.741802 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.741819 | orchestrator | 2026-04-07 00:41:38.741838 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2026-04-07 00:41:38.741857 | orchestrator | Tuesday 07 April 2026 00:41:34 +0000 (0:00:00.121) 0:00:21.286 ********* 2026-04-07 00:41:38.741875 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:41:38.741936 | orchestrator | 2026-04-07 00:41:38.741956 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2026-04-07 00:41:38.741974 | orchestrator | Tuesday 07 April 2026 00:41:34 +0000 (0:00:00.118) 0:00:21.404 ********* 2026-04-07 00:41:38.741992 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f75c5f18-ff10-5900-9978-917c146f798b'}}) 2026-04-07 00:41:38.742012 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '47815a29-012a-570b-a074-b4436c47a2f4'}}) 2026-04-07 00:41:38.742101 | orchestrator | 2026-04-07 00:41:38.742120 | orchestrator | TASK 
[Generate lvm_volumes structure (block + db)] ***************************** 2026-04-07 00:41:38.742139 | orchestrator | Tuesday 07 April 2026 00:41:34 +0000 (0:00:00.155) 0:00:21.560 ********* 2026-04-07 00:41:38.742205 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f75c5f18-ff10-5900-9978-917c146f798b'}})  2026-04-07 00:41:38.742226 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '47815a29-012a-570b-a074-b4436c47a2f4'}})  2026-04-07 00:41:38.742242 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.742259 | orchestrator | 2026-04-07 00:41:38.742276 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2026-04-07 00:41:38.742293 | orchestrator | Tuesday 07 April 2026 00:41:34 +0000 (0:00:00.129) 0:00:21.689 ********* 2026-04-07 00:41:38.742309 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f75c5f18-ff10-5900-9978-917c146f798b'}})  2026-04-07 00:41:38.742326 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '47815a29-012a-570b-a074-b4436c47a2f4'}})  2026-04-07 00:41:38.742343 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.742361 | orchestrator | 2026-04-07 00:41:38.742379 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2026-04-07 00:41:38.742395 | orchestrator | Tuesday 07 April 2026 00:41:34 +0000 (0:00:00.138) 0:00:21.828 ********* 2026-04-07 00:41:38.742412 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f75c5f18-ff10-5900-9978-917c146f798b'}})  2026-04-07 00:41:38.742428 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '47815a29-012a-570b-a074-b4436c47a2f4'}})  2026-04-07 00:41:38.742444 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.742462 | 
orchestrator | 2026-04-07 00:41:38.742480 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2026-04-07 00:41:38.742544 | orchestrator | Tuesday 07 April 2026 00:41:34 +0000 (0:00:00.142) 0:00:21.971 ********* 2026-04-07 00:41:38.742565 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:41:38.742581 | orchestrator | 2026-04-07 00:41:38.742598 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2026-04-07 00:41:38.742614 | orchestrator | Tuesday 07 April 2026 00:41:34 +0000 (0:00:00.135) 0:00:22.106 ********* 2026-04-07 00:41:38.742630 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:41:38.742646 | orchestrator | 2026-04-07 00:41:38.742662 | orchestrator | TASK [Set DB devices config data] ********************************************** 2026-04-07 00:41:38.742678 | orchestrator | Tuesday 07 April 2026 00:41:34 +0000 (0:00:00.137) 0:00:22.244 ********* 2026-04-07 00:41:38.742725 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.742743 | orchestrator | 2026-04-07 00:41:38.742760 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2026-04-07 00:41:38.742776 | orchestrator | Tuesday 07 April 2026 00:41:35 +0000 (0:00:00.124) 0:00:22.368 ********* 2026-04-07 00:41:38.742792 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.742808 | orchestrator | 2026-04-07 00:41:38.742821 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2026-04-07 00:41:38.742834 | orchestrator | Tuesday 07 April 2026 00:41:35 +0000 (0:00:00.292) 0:00:22.661 ********* 2026-04-07 00:41:38.742847 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.742860 | orchestrator | 2026-04-07 00:41:38.742874 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2026-04-07 00:41:38.742902 | orchestrator | Tuesday 07 April 2026 00:41:35 +0000 
(0:00:00.130) 0:00:22.791 ********* 2026-04-07 00:41:38.742915 | orchestrator | ok: [testbed-node-4] => { 2026-04-07 00:41:38.742923 | orchestrator |  "ceph_osd_devices": { 2026-04-07 00:41:38.742933 | orchestrator |  "sdb": { 2026-04-07 00:41:38.742947 | orchestrator |  "osd_lvm_uuid": "f75c5f18-ff10-5900-9978-917c146f798b" 2026-04-07 00:41:38.742960 | orchestrator |  }, 2026-04-07 00:41:38.742973 | orchestrator |  "sdc": { 2026-04-07 00:41:38.742987 | orchestrator |  "osd_lvm_uuid": "47815a29-012a-570b-a074-b4436c47a2f4" 2026-04-07 00:41:38.743000 | orchestrator |  } 2026-04-07 00:41:38.743013 | orchestrator |  } 2026-04-07 00:41:38.743026 | orchestrator | } 2026-04-07 00:41:38.743040 | orchestrator | 2026-04-07 00:41:38.743054 | orchestrator | TASK [Print WAL devices] ******************************************************* 2026-04-07 00:41:38.743066 | orchestrator | Tuesday 07 April 2026 00:41:35 +0000 (0:00:00.137) 0:00:22.929 ********* 2026-04-07 00:41:38.743079 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.743092 | orchestrator | 2026-04-07 00:41:38.743106 | orchestrator | TASK [Print DB devices] ******************************************************** 2026-04-07 00:41:38.743119 | orchestrator | Tuesday 07 April 2026 00:41:35 +0000 (0:00:00.116) 0:00:23.045 ********* 2026-04-07 00:41:38.743133 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.743146 | orchestrator | 2026-04-07 00:41:38.743159 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2026-04-07 00:41:38.743172 | orchestrator | Tuesday 07 April 2026 00:41:35 +0000 (0:00:00.129) 0:00:23.174 ********* 2026-04-07 00:41:38.743185 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:41:38.743198 | orchestrator | 2026-04-07 00:41:38.743211 | orchestrator | TASK [Print configuration data] ************************************************ 2026-04-07 00:41:38.743225 | orchestrator | Tuesday 07 April 2026 00:41:36 +0000 
(0:00:00.133) 0:00:23.308 ********* 2026-04-07 00:41:38.743238 | orchestrator | changed: [testbed-node-4] => { 2026-04-07 00:41:38.743251 | orchestrator |  "_ceph_configure_lvm_config_data": { 2026-04-07 00:41:38.743265 | orchestrator |  "ceph_osd_devices": { 2026-04-07 00:41:38.743278 | orchestrator |  "sdb": { 2026-04-07 00:41:38.743291 | orchestrator |  "osd_lvm_uuid": "f75c5f18-ff10-5900-9978-917c146f798b" 2026-04-07 00:41:38.743305 | orchestrator |  }, 2026-04-07 00:41:38.743319 | orchestrator |  "sdc": { 2026-04-07 00:41:38.743332 | orchestrator |  "osd_lvm_uuid": "47815a29-012a-570b-a074-b4436c47a2f4" 2026-04-07 00:41:38.743345 | orchestrator |  } 2026-04-07 00:41:38.743359 | orchestrator |  }, 2026-04-07 00:41:38.743372 | orchestrator |  "lvm_volumes": [ 2026-04-07 00:41:38.743386 | orchestrator |  { 2026-04-07 00:41:38.743399 | orchestrator |  "data": "osd-block-f75c5f18-ff10-5900-9978-917c146f798b", 2026-04-07 00:41:38.743413 | orchestrator |  "data_vg": "ceph-f75c5f18-ff10-5900-9978-917c146f798b" 2026-04-07 00:41:38.743427 | orchestrator |  }, 2026-04-07 00:41:38.743440 | orchestrator |  { 2026-04-07 00:41:38.743454 | orchestrator |  "data": "osd-block-47815a29-012a-570b-a074-b4436c47a2f4", 2026-04-07 00:41:38.743467 | orchestrator |  "data_vg": "ceph-47815a29-012a-570b-a074-b4436c47a2f4" 2026-04-07 00:41:38.743481 | orchestrator |  } 2026-04-07 00:41:38.743494 | orchestrator |  ] 2026-04-07 00:41:38.743556 | orchestrator |  } 2026-04-07 00:41:38.743570 | orchestrator | } 2026-04-07 00:41:38.743584 | orchestrator | 2026-04-07 00:41:38.743598 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2026-04-07 00:41:38.743611 | orchestrator | Tuesday 07 April 2026 00:41:36 +0000 (0:00:00.228) 0:00:23.536 ********* 2026-04-07 00:41:38.743625 | orchestrator | changed: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2026-04-07 00:41:38.743638 | orchestrator | 2026-04-07 00:41:38.743651 | orchestrator | PLAY [Ceph 
configure LVM] ****************************************************** 2026-04-07 00:41:38.743676 | orchestrator | 2026-04-07 00:41:38.743690 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2026-04-07 00:41:38.743704 | orchestrator | Tuesday 07 April 2026 00:41:37 +0000 (0:00:01.088) 0:00:24.625 ********* 2026-04-07 00:41:38.743718 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2026-04-07 00:41:38.743731 | orchestrator | 2026-04-07 00:41:38.743745 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-04-07 00:41:38.743758 | orchestrator | Tuesday 07 April 2026 00:41:37 +0000 (0:00:00.427) 0:00:25.052 ********* 2026-04-07 00:41:38.743772 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:41:38.743785 | orchestrator | 2026-04-07 00:41:38.743799 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:38.743812 | orchestrator | Tuesday 07 April 2026 00:41:38 +0000 (0:00:00.652) 0:00:25.704 ********* 2026-04-07 00:41:38.743825 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0) 2026-04-07 00:41:38.743839 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2026-04-07 00:41:38.743852 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2026-04-07 00:41:38.743866 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2026-04-07 00:41:38.743879 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2026-04-07 00:41:38.743904 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2026-04-07 00:41:45.767198 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2026-04-07 00:41:45.767304 
| orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2026-04-07 00:41:45.767313 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2026-04-07 00:41:45.767318 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2026-04-07 00:41:45.767322 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2026-04-07 00:41:45.767341 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2026-04-07 00:41:45.767346 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2026-04-07 00:41:45.767350 | orchestrator | 2026-04-07 00:41:45.767355 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767360 | orchestrator | Tuesday 07 April 2026 00:41:38 +0000 (0:00:00.387) 0:00:26.092 ********* 2026-04-07 00:41:45.767364 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767370 | orchestrator | 2026-04-07 00:41:45.767374 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767378 | orchestrator | Tuesday 07 April 2026 00:41:38 +0000 (0:00:00.180) 0:00:26.272 ********* 2026-04-07 00:41:45.767382 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767385 | orchestrator | 2026-04-07 00:41:45.767389 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767393 | orchestrator | Tuesday 07 April 2026 00:41:39 +0000 (0:00:00.175) 0:00:26.448 ********* 2026-04-07 00:41:45.767397 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767401 | orchestrator | 2026-04-07 00:41:45.767405 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767408 | 
orchestrator | Tuesday 07 April 2026 00:41:39 +0000 (0:00:00.174) 0:00:26.622 ********* 2026-04-07 00:41:45.767412 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767416 | orchestrator | 2026-04-07 00:41:45.767422 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767426 | orchestrator | Tuesday 07 April 2026 00:41:39 +0000 (0:00:00.187) 0:00:26.810 ********* 2026-04-07 00:41:45.767430 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767448 | orchestrator | 2026-04-07 00:41:45.767452 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767456 | orchestrator | Tuesday 07 April 2026 00:41:39 +0000 (0:00:00.204) 0:00:27.014 ********* 2026-04-07 00:41:45.767460 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767464 | orchestrator | 2026-04-07 00:41:45.767467 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767471 | orchestrator | Tuesday 07 April 2026 00:41:39 +0000 (0:00:00.188) 0:00:27.202 ********* 2026-04-07 00:41:45.767475 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767479 | orchestrator | 2026-04-07 00:41:45.767482 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767487 | orchestrator | Tuesday 07 April 2026 00:41:40 +0000 (0:00:00.194) 0:00:27.397 ********* 2026-04-07 00:41:45.767490 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767494 | orchestrator | 2026-04-07 00:41:45.767587 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767593 | orchestrator | Tuesday 07 April 2026 00:41:40 +0000 (0:00:00.164) 0:00:27.562 ********* 2026-04-07 00:41:45.767597 | orchestrator | ok: [testbed-node-5] => 
(item=scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9) 2026-04-07 00:41:45.767602 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9) 2026-04-07 00:41:45.767606 | orchestrator | 2026-04-07 00:41:45.767610 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767614 | orchestrator | Tuesday 07 April 2026 00:41:40 +0000 (0:00:00.484) 0:00:28.046 ********* 2026-04-07 00:41:45.767618 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd) 2026-04-07 00:41:45.767622 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd) 2026-04-07 00:41:45.767625 | orchestrator | 2026-04-07 00:41:45.767629 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767633 | orchestrator | Tuesday 07 April 2026 00:41:41 +0000 (0:00:00.618) 0:00:28.665 ********* 2026-04-07 00:41:45.767637 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349) 2026-04-07 00:41:45.767641 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349) 2026-04-07 00:41:45.767644 | orchestrator | 2026-04-07 00:41:45.767648 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:41:45.767652 | orchestrator | Tuesday 07 April 2026 00:41:41 +0000 (0:00:00.377) 0:00:29.042 ********* 2026-04-07 00:41:45.767656 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8) 2026-04-07 00:41:45.767659 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8) 2026-04-07 00:41:45.767663 | orchestrator | 2026-04-07 00:41:45.767667 | orchestrator | TASK [Add known links to 
the list of available block devices] ****************** 2026-04-07 00:41:45.767671 | orchestrator | Tuesday 07 April 2026 00:41:42 +0000 (0:00:00.370) 0:00:29.412 ********* 2026-04-07 00:41:45.767675 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-04-07 00:41:45.767679 | orchestrator | 2026-04-07 00:41:45.767682 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767700 | orchestrator | Tuesday 07 April 2026 00:41:42 +0000 (0:00:00.296) 0:00:29.709 ********* 2026-04-07 00:41:45.767704 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2026-04-07 00:41:45.767708 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2026-04-07 00:41:45.767711 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2026-04-07 00:41:45.767716 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2026-04-07 00:41:45.767724 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2026-04-07 00:41:45.767728 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2026-04-07 00:41:45.767733 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2026-04-07 00:41:45.767737 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2026-04-07 00:41:45.767742 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2026-04-07 00:41:45.767746 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2026-04-07 00:41:45.767750 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 
2026-04-07 00:41:45.767755 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2026-04-07 00:41:45.767759 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2026-04-07 00:41:45.767763 | orchestrator | 2026-04-07 00:41:45.767768 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767772 | orchestrator | Tuesday 07 April 2026 00:41:42 +0000 (0:00:00.314) 0:00:30.023 ********* 2026-04-07 00:41:45.767776 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767781 | orchestrator | 2026-04-07 00:41:45.767785 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767790 | orchestrator | Tuesday 07 April 2026 00:41:42 +0000 (0:00:00.164) 0:00:30.188 ********* 2026-04-07 00:41:45.767794 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767798 | orchestrator | 2026-04-07 00:41:45.767803 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767807 | orchestrator | Tuesday 07 April 2026 00:41:43 +0000 (0:00:00.172) 0:00:30.360 ********* 2026-04-07 00:41:45.767812 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767816 | orchestrator | 2026-04-07 00:41:45.767820 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767825 | orchestrator | Tuesday 07 April 2026 00:41:43 +0000 (0:00:00.181) 0:00:30.541 ********* 2026-04-07 00:41:45.767829 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767833 | orchestrator | 2026-04-07 00:41:45.767841 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767846 | orchestrator | Tuesday 07 April 2026 00:41:43 +0000 (0:00:00.156) 0:00:30.698 ********* 2026-04-07 00:41:45.767850 
| orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767854 | orchestrator | 2026-04-07 00:41:45.767859 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767863 | orchestrator | Tuesday 07 April 2026 00:41:43 +0000 (0:00:00.179) 0:00:30.877 ********* 2026-04-07 00:41:45.767868 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767872 | orchestrator | 2026-04-07 00:41:45.767876 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767881 | orchestrator | Tuesday 07 April 2026 00:41:44 +0000 (0:00:00.425) 0:00:31.302 ********* 2026-04-07 00:41:45.767885 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767889 | orchestrator | 2026-04-07 00:41:45.767894 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767899 | orchestrator | Tuesday 07 April 2026 00:41:44 +0000 (0:00:00.163) 0:00:31.466 ********* 2026-04-07 00:41:45.767903 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767907 | orchestrator | 2026-04-07 00:41:45.767911 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767915 | orchestrator | Tuesday 07 April 2026 00:41:44 +0000 (0:00:00.161) 0:00:31.627 ********* 2026-04-07 00:41:45.767919 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2026-04-07 00:41:45.767923 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2026-04-07 00:41:45.767930 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2026-04-07 00:41:45.767934 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2026-04-07 00:41:45.767937 | orchestrator | 2026-04-07 00:41:45.767941 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767945 | orchestrator | Tuesday 07 April 2026 00:41:44 +0000 (0:00:00.602) 0:00:32.230 
********* 2026-04-07 00:41:45.767949 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767952 | orchestrator | 2026-04-07 00:41:45.767956 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767960 | orchestrator | Tuesday 07 April 2026 00:41:45 +0000 (0:00:00.186) 0:00:32.416 ********* 2026-04-07 00:41:45.767964 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767967 | orchestrator | 2026-04-07 00:41:45.767971 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.767975 | orchestrator | Tuesday 07 April 2026 00:41:45 +0000 (0:00:00.211) 0:00:32.627 ********* 2026-04-07 00:41:45.767981 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.767988 | orchestrator | 2026-04-07 00:41:45.767994 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:41:45.768003 | orchestrator | Tuesday 07 April 2026 00:41:45 +0000 (0:00:00.202) 0:00:32.830 ********* 2026-04-07 00:41:45.768011 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:45.768017 | orchestrator | 2026-04-07 00:41:45.768027 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2026-04-07 00:41:49.612922 | orchestrator | Tuesday 07 April 2026 00:41:45 +0000 (0:00:00.204) 0:00:33.034 ********* 2026-04-07 00:41:49.613016 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': None}) 2026-04-07 00:41:49.613030 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': None}) 2026-04-07 00:41:49.613039 | orchestrator | 2026-04-07 00:41:49.613049 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2026-04-07 00:41:49.613059 | orchestrator | Tuesday 07 April 2026 00:41:45 +0000 (0:00:00.153) 0:00:33.188 ********* 2026-04-07 00:41:49.613068 | orchestrator | skipping: 
[testbed-node-5] 2026-04-07 00:41:49.613078 | orchestrator | 2026-04-07 00:41:49.613088 | orchestrator | TASK [Generate DB VG names] **************************************************** 2026-04-07 00:41:49.613098 | orchestrator | Tuesday 07 April 2026 00:41:46 +0000 (0:00:00.122) 0:00:33.310 ********* 2026-04-07 00:41:49.613106 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613114 | orchestrator | 2026-04-07 00:41:49.613123 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2026-04-07 00:41:49.613132 | orchestrator | Tuesday 07 April 2026 00:41:46 +0000 (0:00:00.121) 0:00:33.431 ********* 2026-04-07 00:41:49.613142 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613151 | orchestrator | 2026-04-07 00:41:49.613160 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2026-04-07 00:41:49.613169 | orchestrator | Tuesday 07 April 2026 00:41:46 +0000 (0:00:00.122) 0:00:33.553 ********* 2026-04-07 00:41:49.613178 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:41:49.613188 | orchestrator | 2026-04-07 00:41:49.613198 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2026-04-07 00:41:49.613207 | orchestrator | Tuesday 07 April 2026 00:41:46 +0000 (0:00:00.289) 0:00:33.843 ********* 2026-04-07 00:41:49.613216 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0842dd12-8111-558f-8152-9e8987e1446c'}}) 2026-04-07 00:41:49.613226 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e59b5a6a-4894-5883-a5b3-f677d5bde0c7'}}) 2026-04-07 00:41:49.613235 | orchestrator | 2026-04-07 00:41:49.613244 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2026-04-07 00:41:49.613252 | orchestrator | Tuesday 07 April 2026 00:41:46 +0000 (0:00:00.166) 0:00:34.010 ********* 2026-04-07 00:41:49.613261 | 
orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0842dd12-8111-558f-8152-9e8987e1446c'}})  2026-04-07 00:41:49.613297 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e59b5a6a-4894-5883-a5b3-f677d5bde0c7'}})  2026-04-07 00:41:49.613307 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613317 | orchestrator | 2026-04-07 00:41:49.613326 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2026-04-07 00:41:49.613335 | orchestrator | Tuesday 07 April 2026 00:41:46 +0000 (0:00:00.145) 0:00:34.156 ********* 2026-04-07 00:41:49.613343 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0842dd12-8111-558f-8152-9e8987e1446c'}})  2026-04-07 00:41:49.613352 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e59b5a6a-4894-5883-a5b3-f677d5bde0c7'}})  2026-04-07 00:41:49.613360 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613370 | orchestrator | 2026-04-07 00:41:49.613408 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2026-04-07 00:41:49.613418 | orchestrator | Tuesday 07 April 2026 00:41:47 +0000 (0:00:00.132) 0:00:34.289 ********* 2026-04-07 00:41:49.613428 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0842dd12-8111-558f-8152-9e8987e1446c'}})  2026-04-07 00:41:49.613436 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e59b5a6a-4894-5883-a5b3-f677d5bde0c7'}})  2026-04-07 00:41:49.613444 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613453 | orchestrator | 2026-04-07 00:41:49.613462 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2026-04-07 00:41:49.613472 | orchestrator | Tuesday 07 April 2026 00:41:47 +0000 
(0:00:00.136) 0:00:34.425 ********* 2026-04-07 00:41:49.613482 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:41:49.613491 | orchestrator | 2026-04-07 00:41:49.613525 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2026-04-07 00:41:49.613534 | orchestrator | Tuesday 07 April 2026 00:41:47 +0000 (0:00:00.122) 0:00:34.548 ********* 2026-04-07 00:41:49.613543 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:41:49.613552 | orchestrator | 2026-04-07 00:41:49.613562 | orchestrator | TASK [Set DB devices config data] ********************************************** 2026-04-07 00:41:49.613571 | orchestrator | Tuesday 07 April 2026 00:41:47 +0000 (0:00:00.121) 0:00:34.669 ********* 2026-04-07 00:41:49.613580 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613588 | orchestrator | 2026-04-07 00:41:49.613597 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2026-04-07 00:41:49.613605 | orchestrator | Tuesday 07 April 2026 00:41:47 +0000 (0:00:00.122) 0:00:34.792 ********* 2026-04-07 00:41:49.613614 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613622 | orchestrator | 2026-04-07 00:41:49.613630 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2026-04-07 00:41:49.613639 | orchestrator | Tuesday 07 April 2026 00:41:47 +0000 (0:00:00.124) 0:00:34.916 ********* 2026-04-07 00:41:49.613647 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613656 | orchestrator | 2026-04-07 00:41:49.613666 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2026-04-07 00:41:49.613676 | orchestrator | Tuesday 07 April 2026 00:41:47 +0000 (0:00:00.129) 0:00:35.046 ********* 2026-04-07 00:41:49.613685 | orchestrator | ok: [testbed-node-5] => { 2026-04-07 00:41:49.613695 | orchestrator |  "ceph_osd_devices": { 2026-04-07 00:41:49.613701 | orchestrator |  "sdb": { 
2026-04-07 00:41:49.613722 | orchestrator |  "osd_lvm_uuid": "0842dd12-8111-558f-8152-9e8987e1446c" 2026-04-07 00:41:49.613729 | orchestrator |  }, 2026-04-07 00:41:49.613735 | orchestrator |  "sdc": { 2026-04-07 00:41:49.613741 | orchestrator |  "osd_lvm_uuid": "e59b5a6a-4894-5883-a5b3-f677d5bde0c7" 2026-04-07 00:41:49.613747 | orchestrator |  } 2026-04-07 00:41:49.613754 | orchestrator |  } 2026-04-07 00:41:49.613762 | orchestrator | } 2026-04-07 00:41:49.613771 | orchestrator | 2026-04-07 00:41:49.613803 | orchestrator | TASK [Print WAL devices] ******************************************************* 2026-04-07 00:41:49.613829 | orchestrator | Tuesday 07 April 2026 00:41:47 +0000 (0:00:00.127) 0:00:35.174 ********* 2026-04-07 00:41:49.613839 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613848 | orchestrator | 2026-04-07 00:41:49.613856 | orchestrator | TASK [Print DB devices] ******************************************************** 2026-04-07 00:41:49.613864 | orchestrator | Tuesday 07 April 2026 00:41:48 +0000 (0:00:00.124) 0:00:35.299 ********* 2026-04-07 00:41:49.613871 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613880 | orchestrator | 2026-04-07 00:41:49.613888 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2026-04-07 00:41:49.613898 | orchestrator | Tuesday 07 April 2026 00:41:48 +0000 (0:00:00.300) 0:00:35.599 ********* 2026-04-07 00:41:49.613906 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:41:49.613915 | orchestrator | 2026-04-07 00:41:49.613925 | orchestrator | TASK [Print configuration data] ************************************************ 2026-04-07 00:41:49.613936 | orchestrator | Tuesday 07 April 2026 00:41:48 +0000 (0:00:00.115) 0:00:35.715 ********* 2026-04-07 00:41:49.613942 | orchestrator | changed: [testbed-node-5] => { 2026-04-07 00:41:49.613947 | orchestrator |  "_ceph_configure_lvm_config_data": { 2026-04-07 00:41:49.613953 | orchestrator | 
 "ceph_osd_devices": { 2026-04-07 00:41:49.613959 | orchestrator |  "sdb": { 2026-04-07 00:41:49.613965 | orchestrator |  "osd_lvm_uuid": "0842dd12-8111-558f-8152-9e8987e1446c" 2026-04-07 00:41:49.613970 | orchestrator |  }, 2026-04-07 00:41:49.613975 | orchestrator |  "sdc": { 2026-04-07 00:41:49.613981 | orchestrator |  "osd_lvm_uuid": "e59b5a6a-4894-5883-a5b3-f677d5bde0c7" 2026-04-07 00:41:49.613986 | orchestrator |  } 2026-04-07 00:41:49.613995 | orchestrator |  }, 2026-04-07 00:41:49.614001 | orchestrator |  "lvm_volumes": [ 2026-04-07 00:41:49.614006 | orchestrator |  { 2026-04-07 00:41:49.614011 | orchestrator |  "data": "osd-block-0842dd12-8111-558f-8152-9e8987e1446c", 2026-04-07 00:41:49.614065 | orchestrator |  "data_vg": "ceph-0842dd12-8111-558f-8152-9e8987e1446c" 2026-04-07 00:41:49.614070 | orchestrator |  }, 2026-04-07 00:41:49.614075 | orchestrator |  { 2026-04-07 00:41:49.614084 | orchestrator |  "data": "osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7", 2026-04-07 00:41:49.614089 | orchestrator |  "data_vg": "ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7" 2026-04-07 00:41:49.614094 | orchestrator |  } 2026-04-07 00:41:49.614099 | orchestrator |  ] 2026-04-07 00:41:49.614104 | orchestrator |  } 2026-04-07 00:41:49.614109 | orchestrator | } 2026-04-07 00:41:49.614114 | orchestrator | 2026-04-07 00:41:49.614119 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2026-04-07 00:41:49.614124 | orchestrator | Tuesday 07 April 2026 00:41:48 +0000 (0:00:00.196) 0:00:35.911 ********* 2026-04-07 00:41:49.614129 | orchestrator | changed: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2026-04-07 00:41:49.614134 | orchestrator | 2026-04-07 00:41:49.614139 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:41:49.614144 | orchestrator | testbed-node-3 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2026-04-07 00:41:49.614151 | 
orchestrator | testbed-node-4 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2026-04-07 00:41:49.614156 | orchestrator | testbed-node-5 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2026-04-07 00:41:49.614161 | orchestrator | 2026-04-07 00:41:49.614166 | orchestrator | 2026-04-07 00:41:49.614171 | orchestrator | 2026-04-07 00:41:49.614176 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:41:49.614181 | orchestrator | Tuesday 07 April 2026 00:41:49 +0000 (0:00:00.955) 0:00:36.867 ********* 2026-04-07 00:41:49.614186 | orchestrator | =============================================================================== 2026-04-07 00:41:49.614196 | orchestrator | Write configuration file ------------------------------------------------ 3.83s 2026-04-07 00:41:49.614202 | orchestrator | Get initial list of available block devices ----------------------------- 1.05s 2026-04-07 00:41:49.614206 | orchestrator | Add known partitions to the list of available block devices ------------- 1.01s 2026-04-07 00:41:49.614211 | orchestrator | Add known links to the list of available block devices ------------------ 1.01s 2026-04-07 00:41:49.614216 | orchestrator | Add known partitions to the list of available block devices ------------- 1.01s 2026-04-07 00:41:49.614221 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.86s 2026-04-07 00:41:49.614226 | orchestrator | Add known links to the list of available block devices ------------------ 0.72s 2026-04-07 00:41:49.614231 | orchestrator | Set UUIDs for OSD VGs/LVs ----------------------------------------------- 0.63s 2026-04-07 00:41:49.614236 | orchestrator | Add known links to the list of available block devices ------------------ 0.62s 2026-04-07 00:41:49.614241 | orchestrator | Add known partitions to the list of available block devices ------------- 0.60s 2026-04-07 
00:41:49.614246 | orchestrator | Print configuration data ------------------------------------------------ 0.60s 2026-04-07 00:41:49.614251 | orchestrator | Generate lvm_volumes structure (block + db + wal) ----------------------- 0.59s 2026-04-07 00:41:49.614256 | orchestrator | Add known partitions to the list of available block devices ------------- 0.56s 2026-04-07 00:41:49.614268 | orchestrator | Set WAL devices config data --------------------------------------------- 0.54s 2026-04-07 00:41:49.900190 | orchestrator | Add known links to the list of available block devices ------------------ 0.54s 2026-04-07 00:41:49.900277 | orchestrator | Define lvm_volumes structures ------------------------------------------- 0.53s 2026-04-07 00:41:49.900284 | orchestrator | Print DB devices -------------------------------------------------------- 0.53s 2026-04-07 00:41:49.900290 | orchestrator | Add known links to the list of available block devices ------------------ 0.50s 2026-04-07 00:41:49.900295 | orchestrator | Generate lvm_volumes structure (block only) ----------------------------- 0.49s 2026-04-07 00:41:49.900299 | orchestrator | Add known links to the list of available block devices ------------------ 0.48s 2026-04-07 00:42:11.504813 | orchestrator | 2026-04-07 00:42:11 | INFO  | Task 64456b24-7311-4d88-bd4e-e2406bd8f7ac (sync inventory) is running in background. Output coming soon. 
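The "Generate lvm_volumes structure (block only)" timing entry above corresponds to the configuration data printed earlier: each `ceph_osd_devices` entry becomes a `data`/`data_vg` pair named after its `osd_lvm_uuid`. A minimal sketch of that mapping, with the `osd-block-<uuid>` / `ceph-<uuid>` naming read off the printed configuration (the function `generate_lvm_volumes` itself is hypothetical, not the OSISM implementation):

```python
# Sketch: derive the lvm_volumes list from ceph_osd_devices, mirroring the
# "Print configuration data" output above. The LV/VG naming scheme
# (osd-block-<uuid> in VG ceph-<uuid>) is taken from the log; the function
# name and signature are assumptions for illustration only.
def generate_lvm_volumes(ceph_osd_devices: dict) -> list:
    volumes = []
    for device in sorted(ceph_osd_devices):
        uuid = ceph_osd_devices[device]["osd_lvm_uuid"]
        volumes.append({
            "data": f"osd-block-{uuid}",    # logical volume for the OSD block
            "data_vg": f"ceph-{uuid}",      # volume group holding that LV
        })
    return volumes
```

Applied to the `sdb`/`sdc` dict shown for testbed-node-5, this reproduces the `lvm_volumes` list printed in the task output.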
2026-04-07 00:42:39.396848 | orchestrator | 2026-04-07 00:42:12 | INFO  | Starting group_vars file reorganization 2026-04-07 00:42:39.396963 | orchestrator | 2026-04-07 00:42:12 | INFO  | Moved 0 file(s) to their respective directories 2026-04-07 00:42:39.396978 | orchestrator | 2026-04-07 00:42:12 | INFO  | Group_vars file reorganization completed 2026-04-07 00:42:39.396993 | orchestrator | 2026-04-07 00:42:15 | INFO  | Starting variable preparation from inventory 2026-04-07 00:42:39.397008 | orchestrator | 2026-04-07 00:42:18 | INFO  | Writing 050-kolla-ceph-rgw-hosts.yml with ceph_rgw_hosts 2026-04-07 00:42:39.397017 | orchestrator | 2026-04-07 00:42:18 | INFO  | Writing 050-infrastructure-cephclient-mons.yml with cephclient_mons 2026-04-07 00:42:39.397026 | orchestrator | 2026-04-07 00:42:18 | INFO  | Writing 050-ceph-cluster-fsid.yml with ceph_cluster_fsid 2026-04-07 00:42:39.397035 | orchestrator | 2026-04-07 00:42:18 | INFO  | 3 file(s) written, 6 host(s) processed 2026-04-07 00:42:39.397045 | orchestrator | 2026-04-07 00:42:18 | INFO  | Variable preparation completed 2026-04-07 00:42:39.397070 | orchestrator | 2026-04-07 00:42:19 | INFO  | Starting inventory overwrite handling 2026-04-07 00:42:39.397081 | orchestrator | 2026-04-07 00:42:19 | INFO  | Handling group overwrites in 99-overwrite 2026-04-07 00:42:39.397089 | orchestrator | 2026-04-07 00:42:19 | INFO  | Removing group frr:children from 60-generic 2026-04-07 00:42:39.397098 | orchestrator | 2026-04-07 00:42:19 | INFO  | Removing group netbird:children from 50-infrastructure 2026-04-07 00:42:39.397127 | orchestrator | 2026-04-07 00:42:19 | INFO  | Removing group ceph-rgw from 50-ceph 2026-04-07 00:42:39.397136 | orchestrator | 2026-04-07 00:42:19 | INFO  | Removing group ceph-mds from 50-ceph 2026-04-07 00:42:39.397145 | orchestrator | 2026-04-07 00:42:19 | INFO  | Handling group overwrites in 20-roles 2026-04-07 00:42:39.397153 | orchestrator | 2026-04-07 00:42:19 | INFO  | Removing group k3s_node 
from 50-infrastructure 2026-04-07 00:42:39.397161 | orchestrator | 2026-04-07 00:42:19 | INFO  | Removed 5 group(s) in total 2026-04-07 00:42:39.397169 | orchestrator | 2026-04-07 00:42:19 | INFO  | Inventory overwrite handling completed 2026-04-07 00:42:39.397177 | orchestrator | 2026-04-07 00:42:20 | INFO  | Starting merge of inventory files 2026-04-07 00:42:39.397185 | orchestrator | 2026-04-07 00:42:20 | INFO  | Inventory files merged successfully 2026-04-07 00:42:39.397193 | orchestrator | 2026-04-07 00:42:25 | INFO  | Generating minified hosts file 2026-04-07 00:42:39.397201 | orchestrator | 2026-04-07 00:42:27 | INFO  | Successfully wrote minified hosts file to /inventory.merge/hosts-minified.yml 2026-04-07 00:42:39.397210 | orchestrator | 2026-04-07 00:42:27 | INFO  | Successfully wrote fast inventory to /inventory.merge/fast/hosts.json 2026-04-07 00:42:39.397219 | orchestrator | 2026-04-07 00:42:28 | INFO  | Generating ClusterShell configuration from Ansible inventory 2026-04-07 00:42:39.397226 | orchestrator | 2026-04-07 00:42:38 | INFO  | Successfully wrote ClusterShell configuration 2026-04-07 00:42:39.397250 | orchestrator | [master 8d19257] 2026-04-07-00-42 2026-04-07 00:42:39.397260 | orchestrator | 5 files changed, 75 insertions(+), 10 deletions(-) 2026-04-07 00:42:39.397269 | orchestrator | create mode 100644 fast/host_vars/testbed-node-3/ceph-lvm-configuration.yml 2026-04-07 00:42:39.397278 | orchestrator | create mode 100644 fast/host_vars/testbed-node-4/ceph-lvm-configuration.yml 2026-04-07 00:42:39.397286 | orchestrator | create mode 100644 fast/host_vars/testbed-node-5/ceph-lvm-configuration.yml 2026-04-07 00:42:40.766148 | orchestrator | 2026-04-07 00:42:40 | INFO  | Prepare task for execution of ceph-create-lvm-devices. 2026-04-07 00:42:40.830008 | orchestrator | 2026-04-07 00:42:40 | INFO  | Task 24c5f7f3-1cbc-40d4-b62b-9f7215693cd8 (ceph-create-lvm-devices) was prepared for execution. 
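The "inventory overwrite handling" messages above ("Removing group frr:children from 60-generic", "Removed 5 group(s) in total") describe groups declared in a higher-priority layer such as 99-overwrite being stripped from lower-priority layers before the merge. A rough sketch of that semantics (the layer names come from the log; `apply_overwrites` and the dict-of-dicts layout are assumptions, not the actual OSISM code):

```python
# Sketch: drop groups named in an overwrite layer from all other inventory
# layers, as in the "Removing group X from Y" log lines above. Returns the
# total number of removals, matching the "Removed N group(s) in total" line.
def apply_overwrites(layers: dict, overwrite_layer: str) -> int:
    removed = 0
    overwrite_groups = set(layers.get(overwrite_layer, {}))
    for name, groups in layers.items():
        if name == overwrite_layer:
            continue
        for group in overwrite_groups & set(groups):
            del groups[group]  # the lower-priority definition loses
            removed += 1
    return removed
```

With this shape, a `frr:children` group present in both 99-overwrite and 60-generic would be deleted from 60-generic only, leaving the overwrite layer's definition authoritative in the subsequent merge step.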
2026-04-07 00:42:40.830147 | orchestrator | 2026-04-07 00:42:40 | INFO  | It takes a moment until task 24c5f7f3-1cbc-40d4-b62b-9f7215693cd8 (ceph-create-lvm-devices) has been started and output is visible here. 2026-04-07 00:42:51.476038 | orchestrator | [WARNING]: Collection community.general does not support Ansible version 2026-04-07 00:42:51.476170 | orchestrator | 2.16.14 2026-04-07 00:42:51.476196 | orchestrator | 2026-04-07 00:42:51.476215 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2026-04-07 00:42:51.476234 | orchestrator | 2026-04-07 00:42:51.476251 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2026-04-07 00:42:51.476269 | orchestrator | Tuesday 07 April 2026 00:42:44 +0000 (0:00:00.240) 0:00:00.240 ********* 2026-04-07 00:42:51.476287 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2026-04-07 00:42:51.476305 | orchestrator | 2026-04-07 00:42:51.476321 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-04-07 00:42:51.476338 | orchestrator | Tuesday 07 April 2026 00:42:44 +0000 (0:00:00.210) 0:00:00.450 ********* 2026-04-07 00:42:51.476355 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:42:51.476372 | orchestrator | 2026-04-07 00:42:51.476388 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.476405 | orchestrator | Tuesday 07 April 2026 00:42:45 +0000 (0:00:00.188) 0:00:00.639 ********* 2026-04-07 00:42:51.476422 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2026-04-07 00:42:51.476503 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2026-04-07 00:42:51.476524 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2026-04-07 00:42:51.476542 | orchestrator | 
included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2026-04-07 00:42:51.476559 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2026-04-07 00:42:51.476577 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2026-04-07 00:42:51.476614 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2026-04-07 00:42:51.476632 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2026-04-07 00:42:51.476650 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2026-04-07 00:42:51.476669 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2026-04-07 00:42:51.476686 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2026-04-07 00:42:51.476703 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2026-04-07 00:42:51.476720 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2026-04-07 00:42:51.476737 | orchestrator | 2026-04-07 00:42:51.476756 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.476774 | orchestrator | Tuesday 07 April 2026 00:42:45 +0000 (0:00:00.341) 0:00:00.980 ********* 2026-04-07 00:42:51.476791 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.476809 | orchestrator | 2026-04-07 00:42:51.476827 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.476846 | orchestrator | Tuesday 07 April 2026 00:42:45 +0000 (0:00:00.351) 0:00:01.332 ********* 2026-04-07 00:42:51.476863 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.476881 | orchestrator | 2026-04-07 00:42:51.476898 | 
orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.476916 | orchestrator | Tuesday 07 April 2026 00:42:46 +0000 (0:00:00.161) 0:00:01.494 ********* 2026-04-07 00:42:51.476934 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.476951 | orchestrator | 2026-04-07 00:42:51.476969 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.476987 | orchestrator | Tuesday 07 April 2026 00:42:46 +0000 (0:00:00.168) 0:00:01.662 ********* 2026-04-07 00:42:51.477004 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.477021 | orchestrator | 2026-04-07 00:42:51.477037 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477054 | orchestrator | Tuesday 07 April 2026 00:42:46 +0000 (0:00:00.169) 0:00:01.831 ********* 2026-04-07 00:42:51.477071 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.477088 | orchestrator | 2026-04-07 00:42:51.477105 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477122 | orchestrator | Tuesday 07 April 2026 00:42:46 +0000 (0:00:00.178) 0:00:02.009 ********* 2026-04-07 00:42:51.477139 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.477155 | orchestrator | 2026-04-07 00:42:51.477172 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477190 | orchestrator | Tuesday 07 April 2026 00:42:46 +0000 (0:00:00.177) 0:00:02.187 ********* 2026-04-07 00:42:51.477206 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.477223 | orchestrator | 2026-04-07 00:42:51.477241 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477259 | orchestrator | Tuesday 07 April 2026 00:42:46 +0000 (0:00:00.226) 0:00:02.414 ********* 
2026-04-07 00:42:51.477277 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.477295 | orchestrator | 2026-04-07 00:42:51.477312 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477341 | orchestrator | Tuesday 07 April 2026 00:42:47 +0000 (0:00:00.178) 0:00:02.593 ********* 2026-04-07 00:42:51.477359 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967) 2026-04-07 00:42:51.477378 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967) 2026-04-07 00:42:51.477395 | orchestrator | 2026-04-07 00:42:51.477412 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477453 | orchestrator | Tuesday 07 April 2026 00:42:47 +0000 (0:00:00.345) 0:00:02.938 ********* 2026-04-07 00:42:51.477502 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76) 2026-04-07 00:42:51.477519 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76) 2026-04-07 00:42:51.477536 | orchestrator | 2026-04-07 00:42:51.477552 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477568 | orchestrator | Tuesday 07 April 2026 00:42:47 +0000 (0:00:00.371) 0:00:03.310 ********* 2026-04-07 00:42:51.477584 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88) 2026-04-07 00:42:51.477600 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88) 2026-04-07 00:42:51.477617 | orchestrator | 2026-04-07 00:42:51.477634 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477650 | orchestrator | Tuesday 07 April 2026 00:42:48 
+0000 (0:00:00.504) 0:00:03.814 ********* 2026-04-07 00:42:51.477666 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d) 2026-04-07 00:42:51.477681 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d) 2026-04-07 00:42:51.477698 | orchestrator | 2026-04-07 00:42:51.477714 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:42:51.477730 | orchestrator | Tuesday 07 April 2026 00:42:48 +0000 (0:00:00.575) 0:00:04.390 ********* 2026-04-07 00:42:51.477747 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-04-07 00:42:51.477762 | orchestrator | 2026-04-07 00:42:51.477776 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:51.477791 | orchestrator | Tuesday 07 April 2026 00:42:49 +0000 (0:00:00.710) 0:00:05.101 ********* 2026-04-07 00:42:51.477807 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2026-04-07 00:42:51.477825 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2026-04-07 00:42:51.477841 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2026-04-07 00:42:51.477858 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2026-04-07 00:42:51.477874 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2026-04-07 00:42:51.477891 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2026-04-07 00:42:51.477908 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2026-04-07 00:42:51.477924 | orchestrator | included: 
/ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2026-04-07 00:42:51.477940 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2026-04-07 00:42:51.477956 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2026-04-07 00:42:51.477972 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2026-04-07 00:42:51.477988 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd) 2026-04-07 00:42:51.478103 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2026-04-07 00:42:51.478125 | orchestrator | 2026-04-07 00:42:51.478142 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:51.478158 | orchestrator | Tuesday 07 April 2026 00:42:50 +0000 (0:00:00.409) 0:00:05.510 ********* 2026-04-07 00:42:51.478174 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.478191 | orchestrator | 2026-04-07 00:42:51.478208 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:51.478225 | orchestrator | Tuesday 07 April 2026 00:42:50 +0000 (0:00:00.256) 0:00:05.767 ********* 2026-04-07 00:42:51.478242 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.478258 | orchestrator | 2026-04-07 00:42:51.478269 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:51.478279 | orchestrator | Tuesday 07 April 2026 00:42:50 +0000 (0:00:00.205) 0:00:05.972 ********* 2026-04-07 00:42:51.478289 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.478299 | orchestrator | 2026-04-07 00:42:51.478321 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:51.478331 | 
orchestrator | Tuesday 07 April 2026 00:42:50 +0000 (0:00:00.179) 0:00:06.152 ********* 2026-04-07 00:42:51.478341 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.478351 | orchestrator | 2026-04-07 00:42:51.478361 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:51.478371 | orchestrator | Tuesday 07 April 2026 00:42:50 +0000 (0:00:00.210) 0:00:06.363 ********* 2026-04-07 00:42:51.478381 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.478390 | orchestrator | 2026-04-07 00:42:51.478400 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:51.478410 | orchestrator | Tuesday 07 April 2026 00:42:51 +0000 (0:00:00.196) 0:00:06.560 ********* 2026-04-07 00:42:51.478420 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.478430 | orchestrator | 2026-04-07 00:42:51.478440 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:51.478450 | orchestrator | Tuesday 07 April 2026 00:42:51 +0000 (0:00:00.184) 0:00:06.744 ********* 2026-04-07 00:42:51.478570 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:51.478587 | orchestrator | 2026-04-07 00:42:51.478616 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:58.964823 | orchestrator | Tuesday 07 April 2026 00:42:51 +0000 (0:00:00.186) 0:00:06.931 ********* 2026-04-07 00:42:58.964925 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.964939 | orchestrator | 2026-04-07 00:42:58.964950 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:58.964960 | orchestrator | Tuesday 07 April 2026 00:42:51 +0000 (0:00:00.179) 0:00:07.111 ********* 2026-04-07 00:42:58.964969 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2026-04-07 00:42:58.964979 | orchestrator | 
ok: [testbed-node-3] => (item=sda14) 2026-04-07 00:42:58.964989 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2026-04-07 00:42:58.964998 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2026-04-07 00:42:58.965007 | orchestrator | 2026-04-07 00:42:58.965017 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:58.965026 | orchestrator | Tuesday 07 April 2026 00:42:52 +0000 (0:00:00.943) 0:00:08.054 ********* 2026-04-07 00:42:58.965035 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965044 | orchestrator | 2026-04-07 00:42:58.965053 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:58.965062 | orchestrator | Tuesday 07 April 2026 00:42:52 +0000 (0:00:00.197) 0:00:08.252 ********* 2026-04-07 00:42:58.965071 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965080 | orchestrator | 2026-04-07 00:42:58.965089 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:58.965098 | orchestrator | Tuesday 07 April 2026 00:42:52 +0000 (0:00:00.190) 0:00:08.442 ********* 2026-04-07 00:42:58.965131 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965141 | orchestrator | 2026-04-07 00:42:58.965150 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:42:58.965159 | orchestrator | Tuesday 07 April 2026 00:42:53 +0000 (0:00:00.173) 0:00:08.615 ********* 2026-04-07 00:42:58.965168 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965177 | orchestrator | 2026-04-07 00:42:58.965185 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2026-04-07 00:42:58.965206 | orchestrator | Tuesday 07 April 2026 00:42:53 +0000 (0:00:00.197) 0:00:08.813 ********* 2026-04-07 00:42:58.965215 | orchestrator | skipping: [testbed-node-3] 2026-04-07 
00:42:58.965224 | orchestrator | 2026-04-07 00:42:58.965233 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2026-04-07 00:42:58.965242 | orchestrator | Tuesday 07 April 2026 00:42:53 +0000 (0:00:00.132) 0:00:08.945 ********* 2026-04-07 00:42:58.965252 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e0113da9-ca02-59fe-bdca-d5482abf5fe2'}}) 2026-04-07 00:42:58.965260 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9eeb51fd-cca7-5129-bb0c-15bc93c67722'}}) 2026-04-07 00:42:58.965269 | orchestrator | 2026-04-07 00:42:58.965278 | orchestrator | TASK [Create block VGs] ******************************************************** 2026-04-07 00:42:58.965287 | orchestrator | Tuesday 07 April 2026 00:42:53 +0000 (0:00:00.185) 0:00:09.130 ********* 2026-04-07 00:42:58.965297 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'}) 2026-04-07 00:42:58.965307 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'}) 2026-04-07 00:42:58.965316 | orchestrator | 2026-04-07 00:42:58.965324 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2026-04-07 00:42:58.965334 | orchestrator | Tuesday 07 April 2026 00:42:55 +0000 (0:00:01.971) 0:00:11.102 ********* 2026-04-07 00:42:58.965343 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:42:58.965353 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:42:58.965364 | orchestrator | skipping: 
[testbed-node-3] 2026-04-07 00:42:58.965375 | orchestrator | 2026-04-07 00:42:58.965385 | orchestrator | TASK [Create block LVs] ******************************************************** 2026-04-07 00:42:58.965396 | orchestrator | Tuesday 07 April 2026 00:42:55 +0000 (0:00:00.155) 0:00:11.257 ********* 2026-04-07 00:42:58.965407 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'}) 2026-04-07 00:42:58.965417 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'}) 2026-04-07 00:42:58.965427 | orchestrator | 2026-04-07 00:42:58.965437 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2026-04-07 00:42:58.965447 | orchestrator | Tuesday 07 April 2026 00:42:57 +0000 (0:00:01.393) 0:00:12.651 ********* 2026-04-07 00:42:58.965484 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:42:58.965499 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:42:58.965513 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965528 | orchestrator | 2026-04-07 00:42:58.965546 | orchestrator | TASK [Create DB VGs] *********************************************************** 2026-04-07 00:42:58.965577 | orchestrator | Tuesday 07 April 2026 00:42:57 +0000 (0:00:00.145) 0:00:12.796 ********* 2026-04-07 00:42:58.965613 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965627 | orchestrator | 2026-04-07 00:42:58.965642 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2026-04-07 00:42:58.965657 | 
orchestrator | Tuesday 07 April 2026 00:42:57 +0000 (0:00:00.132) 0:00:12.929 ********* 2026-04-07 00:42:58.965671 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:42:58.965686 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:42:58.965702 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965715 | orchestrator | 2026-04-07 00:42:58.965730 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2026-04-07 00:42:58.965741 | orchestrator | Tuesday 07 April 2026 00:42:57 +0000 (0:00:00.313) 0:00:13.242 ********* 2026-04-07 00:42:58.965750 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965759 | orchestrator | 2026-04-07 00:42:58.965767 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2026-04-07 00:42:58.965776 | orchestrator | Tuesday 07 April 2026 00:42:57 +0000 (0:00:00.129) 0:00:13.372 ********* 2026-04-07 00:42:58.965785 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:42:58.965794 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:42:58.965803 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965812 | orchestrator | 2026-04-07 00:42:58.965821 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2026-04-07 00:42:58.965830 | orchestrator | Tuesday 07 April 2026 00:42:58 +0000 (0:00:00.133) 0:00:13.506 ********* 2026-04-07 00:42:58.965839 | orchestrator | 
skipping: [testbed-node-3] 2026-04-07 00:42:58.965848 | orchestrator | 2026-04-07 00:42:58.965857 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2026-04-07 00:42:58.965866 | orchestrator | Tuesday 07 April 2026 00:42:58 +0000 (0:00:00.122) 0:00:13.628 ********* 2026-04-07 00:42:58.965875 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:42:58.965884 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:42:58.965892 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965901 | orchestrator | 2026-04-07 00:42:58.965910 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2026-04-07 00:42:58.965919 | orchestrator | Tuesday 07 April 2026 00:42:58 +0000 (0:00:00.134) 0:00:13.763 ********* 2026-04-07 00:42:58.965928 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:42:58.965937 | orchestrator | 2026-04-07 00:42:58.965946 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2026-04-07 00:42:58.965954 | orchestrator | Tuesday 07 April 2026 00:42:58 +0000 (0:00:00.125) 0:00:13.888 ********* 2026-04-07 00:42:58.965963 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:42:58.965972 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:42:58.965981 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.965990 | orchestrator | 2026-04-07 00:42:58.965999 | orchestrator | TASK [Count OSDs put on 
ceph_wal_devices defined in lvm_volumes] *************** 2026-04-07 00:42:58.966008 | orchestrator | Tuesday 07 April 2026 00:42:58 +0000 (0:00:00.135) 0:00:14.024 ********* 2026-04-07 00:42:58.966081 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:42:58.966093 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:42:58.966102 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.966110 | orchestrator | 2026-04-07 00:42:58.966119 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2026-04-07 00:42:58.966128 | orchestrator | Tuesday 07 April 2026 00:42:58 +0000 (0:00:00.122) 0:00:14.146 ********* 2026-04-07 00:42:58.966137 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:42:58.966145 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:42:58.966154 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.966163 | orchestrator | 2026-04-07 00:42:58.966172 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2026-04-07 00:42:58.966180 | orchestrator | Tuesday 07 April 2026 00:42:58 +0000 (0:00:00.153) 0:00:14.299 ********* 2026-04-07 00:42:58.966189 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:42:58.966198 | orchestrator | 2026-04-07 00:42:58.966207 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2026-04-07 00:42:58.966223 | orchestrator | Tuesday 07 April 2026 00:42:58 +0000 
(0:00:00.122) 0:00:14.422 ********* 2026-04-07 00:43:04.371328 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.371421 | orchestrator | 2026-04-07 00:43:04.371436 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2026-04-07 00:43:04.371475 | orchestrator | Tuesday 07 April 2026 00:42:59 +0000 (0:00:00.129) 0:00:14.551 ********* 2026-04-07 00:43:04.371486 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.371495 | orchestrator | 2026-04-07 00:43:04.371505 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2026-04-07 00:43:04.371514 | orchestrator | Tuesday 07 April 2026 00:42:59 +0000 (0:00:00.125) 0:00:14.677 ********* 2026-04-07 00:43:04.371524 | orchestrator | ok: [testbed-node-3] => { 2026-04-07 00:43:04.371533 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2026-04-07 00:43:04.371543 | orchestrator | } 2026-04-07 00:43:04.371553 | orchestrator | 2026-04-07 00:43:04.371562 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2026-04-07 00:43:04.371571 | orchestrator | Tuesday 07 April 2026 00:42:59 +0000 (0:00:00.244) 0:00:14.921 ********* 2026-04-07 00:43:04.371580 | orchestrator | ok: [testbed-node-3] => { 2026-04-07 00:43:04.371588 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2026-04-07 00:43:04.371597 | orchestrator | } 2026-04-07 00:43:04.371606 | orchestrator | 2026-04-07 00:43:04.371615 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2026-04-07 00:43:04.371624 | orchestrator | Tuesday 07 April 2026 00:42:59 +0000 (0:00:00.122) 0:00:15.043 ********* 2026-04-07 00:43:04.371633 | orchestrator | ok: [testbed-node-3] => { 2026-04-07 00:43:04.371642 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2026-04-07 00:43:04.371651 | orchestrator | } 2026-04-07 00:43:04.371660 | orchestrator | 2026-04-07 00:43:04.371669 | orchestrator | 
TASK [Gather DB VGs with total and available size in bytes] ******************** 2026-04-07 00:43:04.371678 | orchestrator | Tuesday 07 April 2026 00:42:59 +0000 (0:00:00.138) 0:00:15.181 ********* 2026-04-07 00:43:04.371687 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:43:04.371695 | orchestrator | 2026-04-07 00:43:04.371710 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2026-04-07 00:43:04.371719 | orchestrator | Tuesday 07 April 2026 00:43:00 +0000 (0:00:00.603) 0:00:15.785 ********* 2026-04-07 00:43:04.371728 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:43:04.371758 | orchestrator | 2026-04-07 00:43:04.371768 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2026-04-07 00:43:04.371777 | orchestrator | Tuesday 07 April 2026 00:43:00 +0000 (0:00:00.478) 0:00:16.263 ********* 2026-04-07 00:43:04.371786 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:43:04.371795 | orchestrator | 2026-04-07 00:43:04.371804 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2026-04-07 00:43:04.371812 | orchestrator | Tuesday 07 April 2026 00:43:01 +0000 (0:00:00.474) 0:00:16.738 ********* 2026-04-07 00:43:04.371821 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:43:04.371830 | orchestrator | 2026-04-07 00:43:04.371839 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2026-04-07 00:43:04.371848 | orchestrator | Tuesday 07 April 2026 00:43:01 +0000 (0:00:00.118) 0:00:16.856 ********* 2026-04-07 00:43:04.371857 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.371867 | orchestrator | 2026-04-07 00:43:04.371877 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2026-04-07 00:43:04.371888 | orchestrator | Tuesday 07 April 2026 00:43:01 +0000 (0:00:00.084) 0:00:16.940 ********* 2026-04-07 00:43:04.371898 | 
orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.371908 | orchestrator | 2026-04-07 00:43:04.371919 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2026-04-07 00:43:04.371929 | orchestrator | Tuesday 07 April 2026 00:43:01 +0000 (0:00:00.112) 0:00:17.052 ********* 2026-04-07 00:43:04.371939 | orchestrator | ok: [testbed-node-3] => { 2026-04-07 00:43:04.371949 | orchestrator |  "vgs_report": { 2026-04-07 00:43:04.371960 | orchestrator |  "vg": [] 2026-04-07 00:43:04.371970 | orchestrator |  } 2026-04-07 00:43:04.371981 | orchestrator | } 2026-04-07 00:43:04.371991 | orchestrator | 2026-04-07 00:43:04.372001 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2026-04-07 00:43:04.372011 | orchestrator | Tuesday 07 April 2026 00:43:01 +0000 (0:00:00.126) 0:00:17.179 ********* 2026-04-07 00:43:04.372022 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372032 | orchestrator | 2026-04-07 00:43:04.372042 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2026-04-07 00:43:04.372052 | orchestrator | Tuesday 07 April 2026 00:43:01 +0000 (0:00:00.110) 0:00:17.290 ********* 2026-04-07 00:43:04.372063 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372073 | orchestrator | 2026-04-07 00:43:04.372083 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2026-04-07 00:43:04.372093 | orchestrator | Tuesday 07 April 2026 00:43:01 +0000 (0:00:00.112) 0:00:17.403 ********* 2026-04-07 00:43:04.372104 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372114 | orchestrator | 2026-04-07 00:43:04.372124 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2026-04-07 00:43:04.372134 | orchestrator | Tuesday 07 April 2026 00:43:02 +0000 (0:00:00.104) 0:00:17.507 ********* 2026-04-07 00:43:04.372144 | 
orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372154 | orchestrator | 2026-04-07 00:43:04.372165 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2026-04-07 00:43:04.372175 | orchestrator | Tuesday 07 April 2026 00:43:02 +0000 (0:00:00.241) 0:00:17.749 ********* 2026-04-07 00:43:04.372185 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372195 | orchestrator | 2026-04-07 00:43:04.372205 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2026-04-07 00:43:04.372218 | orchestrator | Tuesday 07 April 2026 00:43:02 +0000 (0:00:00.120) 0:00:17.870 ********* 2026-04-07 00:43:04.372233 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372248 | orchestrator | 2026-04-07 00:43:04.372262 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2026-04-07 00:43:04.372276 | orchestrator | Tuesday 07 April 2026 00:43:02 +0000 (0:00:00.120) 0:00:17.990 ********* 2026-04-07 00:43:04.372289 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372303 | orchestrator | 2026-04-07 00:43:04.372316 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2026-04-07 00:43:04.372338 | orchestrator | Tuesday 07 April 2026 00:43:02 +0000 (0:00:00.113) 0:00:18.103 ********* 2026-04-07 00:43:04.372370 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372384 | orchestrator | 2026-04-07 00:43:04.372398 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2026-04-07 00:43:04.372412 | orchestrator | Tuesday 07 April 2026 00:43:02 +0000 (0:00:00.122) 0:00:18.226 ********* 2026-04-07 00:43:04.372426 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372440 | orchestrator | 2026-04-07 00:43:04.372477 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 
2026-04-07 00:43:04.372491 | orchestrator | Tuesday 07 April 2026 00:43:02 +0000 (0:00:00.125) 0:00:18.352 ********* 2026-04-07 00:43:04.372505 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372518 | orchestrator | 2026-04-07 00:43:04.372533 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2026-04-07 00:43:04.372548 | orchestrator | Tuesday 07 April 2026 00:43:03 +0000 (0:00:00.119) 0:00:18.472 ********* 2026-04-07 00:43:04.372562 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372576 | orchestrator | 2026-04-07 00:43:04.372592 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2026-04-07 00:43:04.372607 | orchestrator | Tuesday 07 April 2026 00:43:03 +0000 (0:00:00.123) 0:00:18.595 ********* 2026-04-07 00:43:04.372620 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372635 | orchestrator | 2026-04-07 00:43:04.372649 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2026-04-07 00:43:04.372663 | orchestrator | Tuesday 07 April 2026 00:43:03 +0000 (0:00:00.125) 0:00:18.721 ********* 2026-04-07 00:43:04.372678 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372692 | orchestrator | 2026-04-07 00:43:04.372706 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2026-04-07 00:43:04.372722 | orchestrator | Tuesday 07 April 2026 00:43:03 +0000 (0:00:00.126) 0:00:18.847 ********* 2026-04-07 00:43:04.372737 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372751 | orchestrator | 2026-04-07 00:43:04.372774 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2026-04-07 00:43:04.372784 | orchestrator | Tuesday 07 April 2026 00:43:03 +0000 (0:00:00.117) 0:00:18.965 ********* 2026-04-07 00:43:04.372794 | orchestrator | skipping: [testbed-node-3] => (item={'data': 
'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:04.372804 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:04.372813 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372822 | orchestrator | 2026-04-07 00:43:04.372835 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2026-04-07 00:43:04.372856 | orchestrator | Tuesday 07 April 2026 00:43:03 +0000 (0:00:00.133) 0:00:19.098 ********* 2026-04-07 00:43:04.372874 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:04.372888 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:04.372901 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.372915 | orchestrator | 2026-04-07 00:43:04.372927 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2026-04-07 00:43:04.372941 | orchestrator | Tuesday 07 April 2026 00:43:03 +0000 (0:00:00.275) 0:00:19.374 ********* 2026-04-07 00:43:04.372955 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:04.372968 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:04.372996 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.373010 | orchestrator | 2026-04-07 00:43:04.373025 | orchestrator | TASK [Print 'Create WAL LVs for 
ceph_wal_devices'] ***************************** 2026-04-07 00:43:04.373040 | orchestrator | Tuesday 07 April 2026 00:43:04 +0000 (0:00:00.133) 0:00:19.508 ********* 2026-04-07 00:43:04.373055 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:04.373070 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:04.373084 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.373097 | orchestrator | 2026-04-07 00:43:04.373112 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2026-04-07 00:43:04.373129 | orchestrator | Tuesday 07 April 2026 00:43:04 +0000 (0:00:00.136) 0:00:19.644 ********* 2026-04-07 00:43:04.373142 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:04.373155 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:04.373169 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:04.373182 | orchestrator | 2026-04-07 00:43:04.373196 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2026-04-07 00:43:04.373211 | orchestrator | Tuesday 07 April 2026 00:43:04 +0000 (0:00:00.133) 0:00:19.777 ********* 2026-04-07 00:43:04.373246 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:09.262397 | orchestrator | skipping: [testbed-node-3] => (item={'data': 
'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:09.262629 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:09.262651 | orchestrator | 2026-04-07 00:43:09.262665 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2026-04-07 00:43:09.262679 | orchestrator | Tuesday 07 April 2026 00:43:04 +0000 (0:00:00.134) 0:00:19.912 ********* 2026-04-07 00:43:09.262690 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:09.262702 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:09.262713 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:09.262724 | orchestrator | 2026-04-07 00:43:09.262736 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2026-04-07 00:43:09.262747 | orchestrator | Tuesday 07 April 2026 00:43:04 +0000 (0:00:00.154) 0:00:20.066 ********* 2026-04-07 00:43:09.262758 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:09.262770 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:09.262781 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:09.262792 | orchestrator | 2026-04-07 00:43:09.262803 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2026-04-07 00:43:09.262814 | orchestrator | Tuesday 07 April 2026 00:43:04 +0000 (0:00:00.172) 0:00:20.239 ********* 2026-04-07 00:43:09.262826 | 
orchestrator | ok: [testbed-node-3] 2026-04-07 00:43:09.262837 | orchestrator | 2026-04-07 00:43:09.262849 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2026-04-07 00:43:09.262887 | orchestrator | Tuesday 07 April 2026 00:43:05 +0000 (0:00:00.502) 0:00:20.742 ********* 2026-04-07 00:43:09.262901 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:43:09.262915 | orchestrator | 2026-04-07 00:43:09.262929 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2026-04-07 00:43:09.263013 | orchestrator | Tuesday 07 April 2026 00:43:05 +0000 (0:00:00.540) 0:00:21.283 ********* 2026-04-07 00:43:09.263027 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:43:09.263040 | orchestrator | 2026-04-07 00:43:09.263054 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2026-04-07 00:43:09.263085 | orchestrator | Tuesday 07 April 2026 00:43:05 +0000 (0:00:00.136) 0:00:21.419 ********* 2026-04-07 00:43:09.263099 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'vg_name': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'}) 2026-04-07 00:43:09.263114 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'vg_name': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'}) 2026-04-07 00:43:09.263127 | orchestrator | 2026-04-07 00:43:09.263140 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2026-04-07 00:43:09.263154 | orchestrator | Tuesday 07 April 2026 00:43:06 +0000 (0:00:00.153) 0:00:21.573 ********* 2026-04-07 00:43:09.263167 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:09.263180 | orchestrator | skipping: [testbed-node-3] => (item={'data': 
'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:09.263193 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:09.263206 | orchestrator | 2026-04-07 00:43:09.263219 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2026-04-07 00:43:09.263233 | orchestrator | Tuesday 07 April 2026 00:43:06 +0000 (0:00:00.142) 0:00:21.715 ********* 2026-04-07 00:43:09.263246 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:09.263260 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:09.263273 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:09.263286 | orchestrator | 2026-04-07 00:43:09.263299 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2026-04-07 00:43:09.263310 | orchestrator | Tuesday 07 April 2026 00:43:06 +0000 (0:00:00.324) 0:00:22.039 ********* 2026-04-07 00:43:09.263328 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})  2026-04-07 00:43:09.263347 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})  2026-04-07 00:43:09.263364 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:43:09.263378 | orchestrator | 2026-04-07 00:43:09.263398 | orchestrator | TASK [Print LVM report data] *************************************************** 2026-04-07 00:43:09.263418 | orchestrator | Tuesday 07 April 2026 00:43:06 +0000 (0:00:00.140) 0:00:22.180 ********* 2026-04-07 00:43:09.263476 | 
orchestrator | ok: [testbed-node-3] => { 2026-04-07 00:43:09.263489 | orchestrator |  "lvm_report": { 2026-04-07 00:43:09.263501 | orchestrator |  "lv": [ 2026-04-07 00:43:09.263512 | orchestrator |  { 2026-04-07 00:43:09.263524 | orchestrator |  "lv_name": "osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722", 2026-04-07 00:43:09.263535 | orchestrator |  "vg_name": "ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722" 2026-04-07 00:43:09.263546 | orchestrator |  }, 2026-04-07 00:43:09.263557 | orchestrator |  { 2026-04-07 00:43:09.263569 | orchestrator |  "lv_name": "osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2", 2026-04-07 00:43:09.263592 | orchestrator |  "vg_name": "ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2" 2026-04-07 00:43:09.263617 | orchestrator |  } 2026-04-07 00:43:09.263628 | orchestrator |  ], 2026-04-07 00:43:09.263664 | orchestrator |  "pv": [ 2026-04-07 00:43:09.263685 | orchestrator |  { 2026-04-07 00:43:09.263697 | orchestrator |  "pv_name": "/dev/sdb", 2026-04-07 00:43:09.263708 | orchestrator |  "vg_name": "ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2" 2026-04-07 00:43:09.263719 | orchestrator |  }, 2026-04-07 00:43:09.263730 | orchestrator |  { 2026-04-07 00:43:09.263741 | orchestrator |  "pv_name": "/dev/sdc", 2026-04-07 00:43:09.263757 | orchestrator |  "vg_name": "ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722" 2026-04-07 00:43:09.263777 | orchestrator |  } 2026-04-07 00:43:09.263795 | orchestrator |  ] 2026-04-07 00:43:09.263814 | orchestrator |  } 2026-04-07 00:43:09.263834 | orchestrator | } 2026-04-07 00:43:09.263853 | orchestrator | 2026-04-07 00:43:09.263873 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2026-04-07 00:43:09.263893 | orchestrator | 2026-04-07 00:43:09.263913 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2026-04-07 00:43:09.263942 | orchestrator | Tuesday 07 April 2026 00:43:06 +0000 (0:00:00.263) 0:00:22.444 ********* 2026-04-07 
00:43:09.263964 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2026-04-07 00:43:09.263985 | orchestrator | 2026-04-07 00:43:09.264004 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-04-07 00:43:09.264022 | orchestrator | Tuesday 07 April 2026 00:43:07 +0000 (0:00:00.237) 0:00:22.681 ********* 2026-04-07 00:43:09.264033 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:43:09.264044 | orchestrator | 2026-04-07 00:43:09.264056 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:09.264067 | orchestrator | Tuesday 07 April 2026 00:43:07 +0000 (0:00:00.210) 0:00:22.891 ********* 2026-04-07 00:43:09.264078 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2026-04-07 00:43:09.264089 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2026-04-07 00:43:09.264100 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2026-04-07 00:43:09.264111 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2026-04-07 00:43:09.264122 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2026-04-07 00:43:09.264133 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2026-04-07 00:43:09.264144 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2026-04-07 00:43:09.264155 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2026-04-07 00:43:09.264166 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2026-04-07 00:43:09.264177 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2026-04-07 
00:43:09.264187 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2026-04-07 00:43:09.264198 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 2026-04-07 00:43:09.264209 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2026-04-07 00:43:09.264220 | orchestrator | 2026-04-07 00:43:09.264231 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:09.264242 | orchestrator | Tuesday 07 April 2026 00:43:07 +0000 (0:00:00.390) 0:00:23.281 ********* 2026-04-07 00:43:09.264253 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:43:09.264264 | orchestrator | 2026-04-07 00:43:09.264275 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:09.264298 | orchestrator | Tuesday 07 April 2026 00:43:07 +0000 (0:00:00.179) 0:00:23.461 ********* 2026-04-07 00:43:09.264309 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:43:09.264320 | orchestrator | 2026-04-07 00:43:09.264331 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:09.264342 | orchestrator | Tuesday 07 April 2026 00:43:08 +0000 (0:00:00.179) 0:00:23.640 ********* 2026-04-07 00:43:09.264352 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:43:09.264363 | orchestrator | 2026-04-07 00:43:09.264375 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:09.264387 | orchestrator | Tuesday 07 April 2026 00:43:08 +0000 (0:00:00.176) 0:00:23.816 ********* 2026-04-07 00:43:09.264405 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:43:09.264424 | orchestrator | 2026-04-07 00:43:09.264437 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:09.264477 | orchestrator 
| Tuesday 07 April 2026 00:43:08 +0000 (0:00:00.518) 0:00:24.335 ********* 2026-04-07 00:43:09.264489 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:43:09.264500 | orchestrator | 2026-04-07 00:43:09.264511 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:09.264522 | orchestrator | Tuesday 07 April 2026 00:43:09 +0000 (0:00:00.193) 0:00:24.529 ********* 2026-04-07 00:43:09.264533 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:43:09.264544 | orchestrator | 2026-04-07 00:43:09.264566 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:19.259649 | orchestrator | Tuesday 07 April 2026 00:43:09 +0000 (0:00:00.190) 0:00:24.720 ********* 2026-04-07 00:43:19.259753 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:43:19.259770 | orchestrator | 2026-04-07 00:43:19.259783 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:19.259796 | orchestrator | Tuesday 07 April 2026 00:43:09 +0000 (0:00:00.180) 0:00:24.900 ********* 2026-04-07 00:43:19.259807 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:43:19.259818 | orchestrator | 2026-04-07 00:43:19.259830 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:19.259841 | orchestrator | Tuesday 07 April 2026 00:43:09 +0000 (0:00:00.181) 0:00:25.082 ********* 2026-04-07 00:43:19.259852 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82) 2026-04-07 00:43:19.259864 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82) 2026-04-07 00:43:19.259875 | orchestrator | 2026-04-07 00:43:19.259886 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:19.259897 | orchestrator | Tuesday 
07 April 2026 00:43:10 +0000 (0:00:00.401) 0:00:25.484 ********* 2026-04-07 00:43:19.259908 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba) 2026-04-07 00:43:19.259919 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba) 2026-04-07 00:43:19.259930 | orchestrator | 2026-04-07 00:43:19.259941 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:19.259968 | orchestrator | Tuesday 07 April 2026 00:43:10 +0000 (0:00:00.402) 0:00:25.886 ********* 2026-04-07 00:43:19.259980 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6) 2026-04-07 00:43:19.259991 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6) 2026-04-07 00:43:19.260002 | orchestrator | 2026-04-07 00:43:19.260013 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:19.260024 | orchestrator | Tuesday 07 April 2026 00:43:10 +0000 (0:00:00.410) 0:00:26.296 ********* 2026-04-07 00:43:19.260035 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696) 2026-04-07 00:43:19.260064 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696) 2026-04-07 00:43:19.260075 | orchestrator | 2026-04-07 00:43:19.260086 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:19.260097 | orchestrator | Tuesday 07 April 2026 00:43:11 +0000 (0:00:00.415) 0:00:26.711 ********* 2026-04-07 00:43:19.260108 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-04-07 00:43:19.260119 | orchestrator | 2026-04-07 00:43:19.260129 | orchestrator | TASK [Add known partitions to the list of available block devices] 
************* 2026-04-07 00:43:19.260140 | orchestrator | Tuesday 07 April 2026 00:43:11 +0000 (0:00:00.317) 0:00:27.028 ********* 2026-04-07 00:43:19.260151 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2026-04-07 00:43:19.260162 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2026-04-07 00:43:19.260175 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2026-04-07 00:43:19.260188 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2026-04-07 00:43:19.260201 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2026-04-07 00:43:19.260213 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5) 2026-04-07 00:43:19.260226 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2026-04-07 00:43:19.260238 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2026-04-07 00:43:19.260252 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2026-04-07 00:43:19.260265 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2026-04-07 00:43:19.260278 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2026-04-07 00:43:19.260291 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2026-04-07 00:43:19.260305 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2026-04-07 00:43:19.260317 | orchestrator | 2026-04-07 00:43:19.260330 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 
2026-04-07 00:43:19.260343 | orchestrator | Tuesday 07 April 2026 00:43:12 +0000 (0:00:00.592) 0:00:27.621 *********
2026-04-07 00:43:19.260355 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260368 | orchestrator |
2026-04-07 00:43:19.260381 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260394 | orchestrator | Tuesday 07 April 2026 00:43:12 +0000 (0:00:00.189) 0:00:27.810 *********
2026-04-07 00:43:19.260406 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260418 | orchestrator |
2026-04-07 00:43:19.260432 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260468 | orchestrator | Tuesday 07 April 2026 00:43:12 +0000 (0:00:00.189) 0:00:28.000 *********
2026-04-07 00:43:19.260480 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260493 | orchestrator |
2026-04-07 00:43:19.260523 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260535 | orchestrator | Tuesday 07 April 2026 00:43:12 +0000 (0:00:00.193) 0:00:28.193 *********
2026-04-07 00:43:19.260545 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260556 | orchestrator |
2026-04-07 00:43:19.260568 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260579 | orchestrator | Tuesday 07 April 2026 00:43:12 +0000 (0:00:00.176) 0:00:28.370 *********
2026-04-07 00:43:19.260589 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260600 | orchestrator |
2026-04-07 00:43:19.260611 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260629 | orchestrator | Tuesday 07 April 2026 00:43:13 +0000 (0:00:00.190) 0:00:28.560 *********
2026-04-07 00:43:19.260640 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260651 | orchestrator |
2026-04-07 00:43:19.260663 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260673 | orchestrator | Tuesday 07 April 2026 00:43:13 +0000 (0:00:00.194) 0:00:28.755 *********
2026-04-07 00:43:19.260684 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260695 | orchestrator |
2026-04-07 00:43:19.260706 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260718 | orchestrator | Tuesday 07 April 2026 00:43:13 +0000 (0:00:00.189) 0:00:28.944 *********
2026-04-07 00:43:19.260728 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260739 | orchestrator |
2026-04-07 00:43:19.260750 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260761 | orchestrator | Tuesday 07 April 2026 00:43:13 +0000 (0:00:00.184) 0:00:29.128 *********
2026-04-07 00:43:19.260773 | orchestrator | ok: [testbed-node-4] => (item=sda1)
2026-04-07 00:43:19.260791 | orchestrator | ok: [testbed-node-4] => (item=sda14)
2026-04-07 00:43:19.260803 | orchestrator | ok: [testbed-node-4] => (item=sda15)
2026-04-07 00:43:19.260814 | orchestrator | ok: [testbed-node-4] => (item=sda16)
2026-04-07 00:43:19.260825 | orchestrator |
2026-04-07 00:43:19.260836 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260847 | orchestrator | Tuesday 07 April 2026 00:43:14 +0000 (0:00:00.805) 0:00:29.933 *********
2026-04-07 00:43:19.260858 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260868 | orchestrator |
2026-04-07 00:43:19.260879 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260890 | orchestrator | Tuesday 07 April 2026 00:43:14 +0000 (0:00:00.177) 0:00:30.110 *********
2026-04-07 00:43:19.260901 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260912 | orchestrator |
2026-04-07 00:43:19.260923 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260934 | orchestrator | Tuesday 07 April 2026 00:43:14 +0000 (0:00:00.194) 0:00:30.305 *********
2026-04-07 00:43:19.260944 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.260955 | orchestrator |
2026-04-07 00:43:19.260966 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-07 00:43:19.260978 | orchestrator | Tuesday 07 April 2026 00:43:15 +0000 (0:00:00.633) 0:00:30.938 *********
2026-04-07 00:43:19.260989 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.261000 | orchestrator |
2026-04-07 00:43:19.261010 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] *******************
2026-04-07 00:43:19.261021 | orchestrator | Tuesday 07 April 2026 00:43:15 +0000 (0:00:00.196) 0:00:31.135 *********
2026-04-07 00:43:19.261032 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.261043 | orchestrator |
2026-04-07 00:43:19.261054 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] *******************
2026-04-07 00:43:19.261065 | orchestrator | Tuesday 07 April 2026 00:43:15 +0000 (0:00:00.126) 0:00:31.262 *********
2026-04-07 00:43:19.261076 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f75c5f18-ff10-5900-9978-917c146f798b'}})
2026-04-07 00:43:19.261087 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '47815a29-012a-570b-a074-b4436c47a2f4'}})
2026-04-07 00:43:19.261098 | orchestrator |
2026-04-07 00:43:19.261109 | orchestrator | TASK [Create block VGs] ********************************************************
2026-04-07 00:43:19.261120 | orchestrator | Tuesday 07 April 2026 00:43:15 +0000 (0:00:00.185) 0:00:31.447 *********
2026-04-07 00:43:19.261131 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:19.261144 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:19.261161 | orchestrator |
2026-04-07 00:43:19.261172 | orchestrator | TASK [Print 'Create block VGs'] ************************************************
2026-04-07 00:43:19.261183 | orchestrator | Tuesday 07 April 2026 00:43:17 +0000 (0:00:01.844) 0:00:33.291 *********
2026-04-07 00:43:19.261194 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:19.261207 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:19.261218 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:19.261228 | orchestrator |
2026-04-07 00:43:19.261239 | orchestrator | TASK [Create block LVs] ********************************************************
2026-04-07 00:43:19.261250 | orchestrator | Tuesday 07 April 2026 00:43:17 +0000 (0:00:00.148) 0:00:33.440 *********
2026-04-07 00:43:19.261261 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:19.261279 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:24.430138 | orchestrator |
2026-04-07 00:43:24.430263 | orchestrator | TASK [Print 'Create block LVs'] ************************************************
2026-04-07 00:43:24.430291 | orchestrator | Tuesday 07 April 2026 00:43:19 +0000 (0:00:01.355) 0:00:34.795 *********
2026-04-07 00:43:24.430309 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:24.430328 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:24.430345 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.430363 | orchestrator |
2026-04-07 00:43:24.430382 | orchestrator | TASK [Create DB VGs] ***********************************************************
2026-04-07 00:43:24.430401 | orchestrator | Tuesday 07 April 2026 00:43:19 +0000 (0:00:00.146) 0:00:34.941 *********
2026-04-07 00:43:24.430417 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.430432 | orchestrator |
2026-04-07 00:43:24.430559 | orchestrator | TASK [Print 'Create DB VGs'] ***************************************************
2026-04-07 00:43:24.430579 | orchestrator | Tuesday 07 April 2026 00:43:19 +0000 (0:00:00.141) 0:00:35.083 *********
2026-04-07 00:43:24.430620 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:24.430643 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:24.430664 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.430686 | orchestrator |
2026-04-07 00:43:24.430707 | orchestrator | TASK [Create WAL VGs] **********************************************************
2026-04-07 00:43:24.430728 | orchestrator | Tuesday 07 April 2026 00:43:19 +0000 (0:00:00.155) 0:00:35.239 *********
2026-04-07 00:43:24.430748 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.430768 | orchestrator |
2026-04-07 00:43:24.430788 | orchestrator | TASK [Print 'Create WAL VGs'] **************************************************
2026-04-07 00:43:24.430810 | orchestrator | Tuesday 07 April 2026 00:43:19 +0000 (0:00:00.160) 0:00:35.399 *********
2026-04-07 00:43:24.430830 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:24.430851 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:24.430871 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.430920 | orchestrator |
2026-04-07 00:43:24.430933 | orchestrator | TASK [Create DB+WAL VGs] *******************************************************
2026-04-07 00:43:24.430945 | orchestrator | Tuesday 07 April 2026 00:43:20 +0000 (0:00:00.136) 0:00:35.535 *********
2026-04-07 00:43:24.430956 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.430967 | orchestrator |
2026-04-07 00:43:24.430979 | orchestrator | TASK [Print 'Create DB+WAL VGs'] ***********************************************
2026-04-07 00:43:24.430990 | orchestrator | Tuesday 07 April 2026 00:43:20 +0000 (0:00:00.360) 0:00:35.896 *********
2026-04-07 00:43:24.431002 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:24.431013 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:24.431024 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.431035 | orchestrator |
2026-04-07 00:43:24.431046 | orchestrator | TASK [Prepare variables for OSD count check] ***********************************
2026-04-07 00:43:24.431057 | orchestrator | Tuesday 07 April 2026 00:43:20 +0000 (0:00:00.139) 0:00:36.036 *********
2026-04-07 00:43:24.431068 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:43:24.431080 | orchestrator |
2026-04-07 00:43:24.431091 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] ****************
2026-04-07 00:43:24.431102 | orchestrator | Tuesday 07 April 2026 00:43:20 +0000 (0:00:00.113) 0:00:36.149 *********
2026-04-07 00:43:24.431114 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:24.431125 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:24.431136 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.431147 | orchestrator |
2026-04-07 00:43:24.431175 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] ***************
2026-04-07 00:43:24.431196 | orchestrator | Tuesday 07 April 2026 00:43:20 +0000 (0:00:00.151) 0:00:36.300 *********
2026-04-07 00:43:24.431208 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:24.431219 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:24.431230 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.431241 | orchestrator |
2026-04-07 00:43:24.431252 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************
2026-04-07 00:43:24.431288 | orchestrator | Tuesday 07 April 2026 00:43:20 +0000 (0:00:00.159) 0:00:36.460 *********
2026-04-07 00:43:24.431309 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:24.431328 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:24.431346 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.431365 | orchestrator |
2026-04-07 00:43:24.431384 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] *********************
2026-04-07 00:43:24.431405 | orchestrator | Tuesday 07 April 2026 00:43:21 +0000 (0:00:00.116) 0:00:36.576 *********
2026-04-07 00:43:24.431423 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.431470 | orchestrator |
2026-04-07 00:43:24.431483 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ********************
2026-04-07 00:43:24.431494 | orchestrator | Tuesday 07 April 2026 00:43:21 +0000 (0:00:00.108) 0:00:36.684 *********
2026-04-07 00:43:24.431505 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.431528 | orchestrator |
2026-04-07 00:43:24.431539 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] *****************
2026-04-07 00:43:24.431550 | orchestrator | Tuesday 07 April 2026 00:43:21 +0000 (0:00:00.124) 0:00:36.808 *********
2026-04-07 00:43:24.431561 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.431573 | orchestrator |
2026-04-07 00:43:24.431592 | orchestrator | TASK [Print number of OSDs wanted per DB VG] ***********************************
2026-04-07 00:43:24.431603 | orchestrator | Tuesday 07 April 2026 00:43:21 +0000 (0:00:00.114) 0:00:36.923 *********
2026-04-07 00:43:24.431614 | orchestrator | ok: [testbed-node-4] => {
2026-04-07 00:43:24.431626 | orchestrator |  "_num_osds_wanted_per_db_vg": {}
2026-04-07 00:43:24.431638 | orchestrator | }
2026-04-07 00:43:24.431649 | orchestrator |
2026-04-07 00:43:24.431660 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] **********************************
2026-04-07 00:43:24.431672 | orchestrator | Tuesday 07 April 2026 00:43:21 +0000 (0:00:00.116) 0:00:37.039 *********
2026-04-07 00:43:24.431683 | orchestrator | ok: [testbed-node-4] => {
2026-04-07 00:43:24.431694 | orchestrator |  "_num_osds_wanted_per_wal_vg": {}
2026-04-07 00:43:24.431705 | orchestrator | }
2026-04-07 00:43:24.431716 | orchestrator |
2026-04-07 00:43:24.431728 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] *******************************
2026-04-07 00:43:24.431739 | orchestrator | Tuesday 07 April 2026 00:43:21 +0000 (0:00:00.127) 0:00:37.166 *********
2026-04-07 00:43:24.431750 | orchestrator | ok: [testbed-node-4] => {
2026-04-07 00:43:24.431761 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {}
2026-04-07 00:43:24.431772 | orchestrator | }
2026-04-07 00:43:24.431784 | orchestrator |
2026-04-07 00:43:24.431795 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ********************
2026-04-07 00:43:24.431806 | orchestrator | Tuesday 07 April 2026 00:43:21 +0000 (0:00:00.114) 0:00:37.280 *********
2026-04-07 00:43:24.431818 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:43:24.431829 | orchestrator |
2026-04-07 00:43:24.431840 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] *******************
2026-04-07 00:43:24.431851 | orchestrator | Tuesday 07 April 2026 00:43:22 +0000 (0:00:00.661) 0:00:37.942 *********
2026-04-07 00:43:24.431862 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:43:24.431873 | orchestrator |
2026-04-07 00:43:24.431884 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] ****************
2026-04-07 00:43:24.431896 | orchestrator | Tuesday 07 April 2026 00:43:22 +0000 (0:00:00.498) 0:00:38.440 *********
2026-04-07 00:43:24.431907 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:43:24.431918 | orchestrator |
2026-04-07 00:43:24.431929 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] *************************
2026-04-07 00:43:24.431940 | orchestrator | Tuesday 07 April 2026 00:43:23 +0000 (0:00:00.542) 0:00:38.983 *********
2026-04-07 00:43:24.431952 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:43:24.431963 | orchestrator |
2026-04-07 00:43:24.431974 | orchestrator | TASK [Calculate VG sizes (without buffer)] *************************************
2026-04-07 00:43:24.431985 | orchestrator | Tuesday 07 April 2026 00:43:23 +0000 (0:00:00.134) 0:00:39.118 *********
2026-04-07 00:43:24.431997 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.432007 | orchestrator |
2026-04-07 00:43:24.432019 | orchestrator | TASK [Calculate VG sizes (with buffer)] ****************************************
2026-04-07 00:43:24.432030 | orchestrator | Tuesday 07 April 2026 00:43:23 +0000 (0:00:00.096) 0:00:39.214 *********
2026-04-07 00:43:24.432041 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.432052 | orchestrator |
2026-04-07 00:43:24.432063 | orchestrator | TASK [Print LVM VGs report data] ***********************************************
2026-04-07 00:43:24.432075 | orchestrator | Tuesday 07 April 2026 00:43:23 +0000 (0:00:00.092) 0:00:39.306 *********
2026-04-07 00:43:24.432086 | orchestrator | ok: [testbed-node-4] => {
2026-04-07 00:43:24.432098 | orchestrator |  "vgs_report": {
2026-04-07 00:43:24.432110 | orchestrator |  "vg": []
2026-04-07 00:43:24.432121 | orchestrator |  }
2026-04-07 00:43:24.432133 | orchestrator | }
2026-04-07 00:43:24.432144 | orchestrator |
2026-04-07 00:43:24.432155 | orchestrator | TASK [Print LVM VG sizes] ******************************************************
2026-04-07 00:43:24.432173 | orchestrator | Tuesday 07 April 2026 00:43:23 +0000 (0:00:00.130) 0:00:39.437 *********
2026-04-07 00:43:24.432185 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.432196 | orchestrator |
2026-04-07 00:43:24.432207 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************
2026-04-07 00:43:24.432218 | orchestrator | Tuesday 07 April 2026 00:43:24 +0000 (0:00:00.120) 0:00:39.557 *********
2026-04-07 00:43:24.432230 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.432241 | orchestrator |
2026-04-07 00:43:24.432252 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] ****************************
2026-04-07 00:43:24.432263 | orchestrator | Tuesday 07 April 2026 00:43:24 +0000 (0:00:00.102) 0:00:39.660 *********
2026-04-07 00:43:24.432274 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.432285 | orchestrator |
2026-04-07 00:43:24.432297 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] *******************
2026-04-07 00:43:24.432308 | orchestrator | Tuesday 07 April 2026 00:43:24 +0000 (0:00:00.105) 0:00:39.765 *********
2026-04-07 00:43:24.432319 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:24.432331 | orchestrator |
2026-04-07 00:43:24.432352 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] ***********************
2026-04-07 00:43:28.337100 | orchestrator | Tuesday 07 April 2026 00:43:24 +0000 (0:00:00.124) 0:00:39.889 *********
2026-04-07 00:43:28.337181 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337190 | orchestrator |
2026-04-07 00:43:28.337197 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] ***************************
2026-04-07 00:43:28.337203 | orchestrator | Tuesday 07 April 2026 00:43:24 +0000 (0:00:00.121) 0:00:40.011 *********
2026-04-07 00:43:28.337209 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337215 | orchestrator |
2026-04-07 00:43:28.337221 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] *****************
2026-04-07 00:43:28.337227 | orchestrator | Tuesday 07 April 2026 00:43:24 +0000 (0:00:00.248) 0:00:40.259 *********
2026-04-07 00:43:28.337232 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337237 | orchestrator |
2026-04-07 00:43:28.337243 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] ****************
2026-04-07 00:43:28.337249 | orchestrator | Tuesday 07 April 2026 00:43:24 +0000 (0:00:00.103) 0:00:40.363 *********
2026-04-07 00:43:28.337254 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337259 | orchestrator |
2026-04-07 00:43:28.337265 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ********************
2026-04-07 00:43:28.337270 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.122) 0:00:40.485 *********
2026-04-07 00:43:28.337276 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337281 | orchestrator |
2026-04-07 00:43:28.337287 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] *****************
2026-04-07 00:43:28.337293 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.112) 0:00:40.597 *********
2026-04-07 00:43:28.337298 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337304 | orchestrator |
2026-04-07 00:43:28.337309 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] *********************
2026-04-07 00:43:28.337315 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.102) 0:00:40.700 *********
2026-04-07 00:43:28.337320 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337326 | orchestrator |
2026-04-07 00:43:28.337331 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] ***********
2026-04-07 00:43:28.337337 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.116) 0:00:40.816 *********
2026-04-07 00:43:28.337342 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337348 | orchestrator |
2026-04-07 00:43:28.337354 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] *************************
2026-04-07 00:43:28.337374 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.126) 0:00:40.942 *********
2026-04-07 00:43:28.337380 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337386 | orchestrator |
2026-04-07 00:43:28.337391 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] *********************
2026-04-07 00:43:28.337412 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.122) 0:00:41.065 *********
2026-04-07 00:43:28.337418 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337424 | orchestrator |
2026-04-07 00:43:28.337429 | orchestrator | TASK [Create DB LVs for ceph_db_devices] ***************************************
2026-04-07 00:43:28.337495 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.119) 0:00:41.184 *********
2026-04-07 00:43:28.337502 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337508 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337514 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337519 | orchestrator |
2026-04-07 00:43:28.337525 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] *******************************
2026-04-07 00:43:28.337531 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.124) 0:00:41.309 *********
2026-04-07 00:43:28.337536 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337542 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337547 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337553 | orchestrator |
2026-04-07 00:43:28.337558 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] *************************************
2026-04-07 00:43:28.337564 | orchestrator | Tuesday 07 April 2026 00:43:25 +0000 (0:00:00.117) 0:00:41.427 *********
2026-04-07 00:43:28.337569 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337575 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337581 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337586 | orchestrator |
2026-04-07 00:43:28.337592 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] *****************************
2026-04-07 00:43:28.337597 | orchestrator | Tuesday 07 April 2026 00:43:26 +0000 (0:00:00.127) 0:00:41.555 *********
2026-04-07 00:43:28.337603 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337608 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337614 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337620 | orchestrator |
2026-04-07 00:43:28.337638 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] **********************************
2026-04-07 00:43:28.337644 | orchestrator | Tuesday 07 April 2026 00:43:26 +0000 (0:00:00.242) 0:00:41.798 *********
2026-04-07 00:43:28.337650 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337655 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337661 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337667 | orchestrator |
2026-04-07 00:43:28.337673 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] **************************
2026-04-07 00:43:28.337680 | orchestrator | Tuesday 07 April 2026 00:43:26 +0000 (0:00:00.130) 0:00:41.929 *********
2026-04-07 00:43:28.337686 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337701 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337707 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337713 | orchestrator |
2026-04-07 00:43:28.337720 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] ***********************************
2026-04-07 00:43:28.337726 | orchestrator | Tuesday 07 April 2026 00:43:26 +0000 (0:00:00.128) 0:00:42.057 *********
2026-04-07 00:43:28.337732 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337738 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337745 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337751 | orchestrator |
2026-04-07 00:43:28.337757 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] ***************************
2026-04-07 00:43:28.337763 | orchestrator | Tuesday 07 April 2026 00:43:26 +0000 (0:00:00.123) 0:00:42.180 *********
2026-04-07 00:43:28.337770 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337776 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337782 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337789 | orchestrator |
2026-04-07 00:43:28.337795 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ********************************
2026-04-07 00:43:28.337801 | orchestrator | Tuesday 07 April 2026 00:43:26 +0000 (0:00:00.138) 0:00:42.319 *********
2026-04-07 00:43:28.337807 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:43:28.337813 | orchestrator |
2026-04-07 00:43:28.337820 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ********************************
2026-04-07 00:43:28.337826 | orchestrator | Tuesday 07 April 2026 00:43:27 +0000 (0:00:00.506) 0:00:42.825 *********
2026-04-07 00:43:28.337832 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:43:28.337838 | orchestrator |
2026-04-07 00:43:28.337845 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] ***********************
2026-04-07 00:43:28.337851 | orchestrator | Tuesday 07 April 2026 00:43:27 +0000 (0:00:00.492) 0:00:43.318 *********
2026-04-07 00:43:28.337858 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:43:28.337864 | orchestrator |
2026-04-07 00:43:28.337870 | orchestrator | TASK [Create list of VG/LV names] **********************************************
2026-04-07 00:43:28.337877 | orchestrator | Tuesday 07 April 2026 00:43:27 +0000 (0:00:00.146) 0:00:43.457 *********
2026-04-07 00:43:28.337883 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'vg_name': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337890 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'vg_name': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337895 | orchestrator |
2026-04-07 00:43:28.337901 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] **********************
2026-04-07 00:43:28.337906 | orchestrator | Tuesday 07 April 2026 00:43:28 +0000 (0:00:00.146) 0:00:43.603 *********
2026-04-07 00:43:28.337912 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337917 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:28.337923 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:28.337928 | orchestrator |
2026-04-07 00:43:28.337937 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] *************************
2026-04-07 00:43:28.337943 | orchestrator | Tuesday 07 April 2026 00:43:28 +0000 (0:00:00.130) 0:00:43.734 *********
2026-04-07 00:43:28.337948 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:28.337958 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:33.458592 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:33.458717 | orchestrator |
2026-04-07 00:43:33.458735 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************
2026-04-07 00:43:33.458749 | orchestrator | Tuesday 07 April 2026 00:43:28 +0000 (0:00:00.140) 0:00:43.874 *********
2026-04-07 00:43:33.458761 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:43:33.458774 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:43:33.458785 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:43:33.458797 | orchestrator |
2026-04-07 00:43:33.458808 | orchestrator | TASK [Print LVM report data] ***************************************************
2026-04-07 00:43:33.458819 | orchestrator | Tuesday 07 April 2026 00:43:28 +0000 (0:00:00.131) 0:00:44.005 *********
2026-04-07 00:43:33.458830 | orchestrator | ok: [testbed-node-4] => {
2026-04-07 00:43:33.458842 | orchestrator |  "lvm_report": {
2026-04-07 00:43:33.458856 | orchestrator |  "lv": [
2026-04-07 00:43:33.458875 | orchestrator |  {
2026-04-07 00:43:33.458915 | orchestrator |  "lv_name": "osd-block-47815a29-012a-570b-a074-b4436c47a2f4",
2026-04-07 00:43:33.458935 | orchestrator |  "vg_name": "ceph-47815a29-012a-570b-a074-b4436c47a2f4"
2026-04-07 00:43:33.458952 | orchestrator |  },
2026-04-07 00:43:33.458969 | orchestrator |  {
2026-04-07 00:43:33.458986 | orchestrator |  "lv_name": "osd-block-f75c5f18-ff10-5900-9978-917c146f798b",
2026-04-07 00:43:33.459002 | orchestrator |  "vg_name": "ceph-f75c5f18-ff10-5900-9978-917c146f798b"
2026-04-07 00:43:33.459017 | orchestrator |  }
2026-04-07 00:43:33.459032 | orchestrator |  ],
2026-04-07 00:43:33.459048 | orchestrator |  "pv": [
2026-04-07 00:43:33.459065 | orchestrator |  {
2026-04-07 00:43:33.459083 | orchestrator |  "pv_name": "/dev/sdb",
2026-04-07 00:43:33.459102 | orchestrator |  "vg_name": "ceph-f75c5f18-ff10-5900-9978-917c146f798b"
2026-04-07 00:43:33.459121 | orchestrator |  },
2026-04-07 00:43:33.459139 | orchestrator |  {
2026-04-07 00:43:33.459160 | orchestrator |  "pv_name": "/dev/sdc",
2026-04-07 00:43:33.459179 | orchestrator |  "vg_name": "ceph-47815a29-012a-570b-a074-b4436c47a2f4"
2026-04-07 00:43:33.459198 | orchestrator |  }
2026-04-07 00:43:33.459217 | orchestrator |  ]
2026-04-07 00:43:33.459237 | orchestrator |  }
2026-04-07 00:43:33.459256 | orchestrator | }
2026-04-07 00:43:33.459274 | orchestrator |
2026-04-07 00:43:33.459294 | orchestrator | PLAY [Ceph create LVM devices] *************************************************
2026-04-07 00:43:33.459314 | orchestrator |
2026-04-07 00:43:33.459333 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-04-07 00:43:33.459353 | orchestrator | Tuesday 07 April 2026 00:43:28 +0000 (0:00:00.372) 0:00:44.378 *********
2026-04-07 00:43:33.459373 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)]
2026-04-07 00:43:33.459393 | orchestrator |
2026-04-07 00:43:33.459411 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-04-07 00:43:33.459453 | orchestrator | Tuesday 07 April 2026 00:43:29 +0000 (0:00:00.213) 0:00:44.602 *********
2026-04-07 00:43:33.459465 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:43:33.459503 | orchestrator |
2026-04-07 00:43:33.459515 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-07 00:43:33.459527 | orchestrator | Tuesday 07 April 2026 00:43:29 +0000 (0:00:00.213) 0:00:44.816 *********
2026-04-07 00:43:33.459538 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0)
2026-04-07 00:43:33.459549 | orchestrator | included:
/ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2026-04-07 00:43:33.459560 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2026-04-07 00:43:33.459571 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2026-04-07 00:43:33.459587 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2026-04-07 00:43:33.459598 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2026-04-07 00:43:33.459609 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2026-04-07 00:43:33.459620 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2026-04-07 00:43:33.459631 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2026-04-07 00:43:33.459641 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2026-04-07 00:43:33.459652 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2026-04-07 00:43:33.459667 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2026-04-07 00:43:33.459685 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2026-04-07 00:43:33.459704 | orchestrator | 2026-04-07 00:43:33.459720 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.459737 | orchestrator | Tuesday 07 April 2026 00:43:29 +0000 (0:00:00.357) 0:00:45.174 ********* 2026-04-07 00:43:33.459753 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:33.459770 | orchestrator | 2026-04-07 00:43:33.459786 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 
00:43:33.459803 | orchestrator | Tuesday 07 April 2026 00:43:29 +0000 (0:00:00.175) 0:00:45.349 ********* 2026-04-07 00:43:33.459819 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:33.459836 | orchestrator | 2026-04-07 00:43:33.459854 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.459901 | orchestrator | Tuesday 07 April 2026 00:43:30 +0000 (0:00:00.178) 0:00:45.528 ********* 2026-04-07 00:43:33.459921 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:33.459941 | orchestrator | 2026-04-07 00:43:33.459961 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.459981 | orchestrator | Tuesday 07 April 2026 00:43:30 +0000 (0:00:00.175) 0:00:45.703 ********* 2026-04-07 00:43:33.460001 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:33.460020 | orchestrator | 2026-04-07 00:43:33.460038 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460055 | orchestrator | Tuesday 07 April 2026 00:43:30 +0000 (0:00:00.179) 0:00:45.883 ********* 2026-04-07 00:43:33.460074 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:33.460094 | orchestrator | 2026-04-07 00:43:33.460112 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460130 | orchestrator | Tuesday 07 April 2026 00:43:30 +0000 (0:00:00.167) 0:00:46.051 ********* 2026-04-07 00:43:33.460141 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:33.460152 | orchestrator | 2026-04-07 00:43:33.460163 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460174 | orchestrator | Tuesday 07 April 2026 00:43:31 +0000 (0:00:00.426) 0:00:46.477 ********* 2026-04-07 00:43:33.460186 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:33.460197 | 
orchestrator | 2026-04-07 00:43:33.460221 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460232 | orchestrator | Tuesday 07 April 2026 00:43:31 +0000 (0:00:00.177) 0:00:46.654 ********* 2026-04-07 00:43:33.460243 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:33.460254 | orchestrator | 2026-04-07 00:43:33.460265 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460276 | orchestrator | Tuesday 07 April 2026 00:43:31 +0000 (0:00:00.173) 0:00:46.827 ********* 2026-04-07 00:43:33.460287 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9) 2026-04-07 00:43:33.460299 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9) 2026-04-07 00:43:33.460310 | orchestrator | 2026-04-07 00:43:33.460320 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460331 | orchestrator | Tuesday 07 April 2026 00:43:31 +0000 (0:00:00.365) 0:00:47.192 ********* 2026-04-07 00:43:33.460342 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd) 2026-04-07 00:43:33.460353 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd) 2026-04-07 00:43:33.460364 | orchestrator | 2026-04-07 00:43:33.460375 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460386 | orchestrator | Tuesday 07 April 2026 00:43:32 +0000 (0:00:00.384) 0:00:47.577 ********* 2026-04-07 00:43:33.460397 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349) 2026-04-07 00:43:33.460408 | orchestrator | ok: [testbed-node-5] => 
(item=scsi-SQEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349) 2026-04-07 00:43:33.460419 | orchestrator | 2026-04-07 00:43:33.460468 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460512 | orchestrator | Tuesday 07 April 2026 00:43:32 +0000 (0:00:00.368) 0:00:47.946 ********* 2026-04-07 00:43:33.460523 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8) 2026-04-07 00:43:33.460534 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8) 2026-04-07 00:43:33.460545 | orchestrator | 2026-04-07 00:43:33.460555 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-07 00:43:33.460566 | orchestrator | Tuesday 07 April 2026 00:43:32 +0000 (0:00:00.382) 0:00:48.328 ********* 2026-04-07 00:43:33.460577 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-04-07 00:43:33.460588 | orchestrator | 2026-04-07 00:43:33.460599 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:33.460610 | orchestrator | Tuesday 07 April 2026 00:43:33 +0000 (0:00:00.304) 0:00:48.633 ********* 2026-04-07 00:43:33.460620 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2026-04-07 00:43:33.460632 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2026-04-07 00:43:33.460642 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2026-04-07 00:43:33.460653 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2026-04-07 00:43:33.460664 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2026-04-07 00:43:33.460675 | orchestrator | 
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2026-04-07 00:43:33.460686 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2026-04-07 00:43:33.460696 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2026-04-07 00:43:33.460707 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2026-04-07 00:43:33.460773 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2026-04-07 00:43:33.460786 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2026-04-07 00:43:33.460822 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2026-04-07 00:43:41.306388 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2026-04-07 00:43:41.306557 | orchestrator | 2026-04-07 00:43:41.306575 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.306587 | orchestrator | Tuesday 07 April 2026 00:43:33 +0000 (0:00:00.359) 0:00:48.993 ********* 2026-04-07 00:43:41.306599 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.306611 | orchestrator | 2026-04-07 00:43:41.306623 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.306635 | orchestrator | Tuesday 07 April 2026 00:43:33 +0000 (0:00:00.172) 0:00:49.165 ********* 2026-04-07 00:43:41.306646 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.306657 | orchestrator | 2026-04-07 00:43:41.306668 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.306679 | orchestrator | Tuesday 07 April 2026 00:43:33 +0000 (0:00:00.186) 0:00:49.352 ********* 
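The repeated "Add known links to the list of available block devices" tasks collect, for each kernel device (loop0..loop7, sda..sdd, sr0), the stable `/dev/disk/by-id` symlinks that point at it, so either name can be used when declaring `ceph_osd_devices`. A sketch of that lookup, using link names from the log; the symlink table below is an illustrative stand-in, where the real tasks read `/dev/disk/by-id` from the filesystem:

```python
# Illustrative by-id -> kernel-device table (assumed mapping; the actual
# device behind each link is resolved on the node, not shown in the log).
by_id_links = {
    "scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9": "sda",
    "scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9": "sda",
    "ata-QEMU_DVD-ROM_QM00001": "sr0",
}

def links_for(device: str) -> list[str]:
    """Return all by-id aliases that resolve to the given kernel device."""
    return sorted(name for name, target in by_id_links.items() if target == device)
```

This also explains the log's pattern of one "ok" per disk with two items each: QEMU disks get both a `scsi-0QEMU_...` and a `scsi-SQEMU_...` alias, while the DVD drive gets a single `ata-...` link.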
2026-04-07 00:43:41.306691 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.306702 | orchestrator | 2026-04-07 00:43:41.306713 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.306741 | orchestrator | Tuesday 07 April 2026 00:43:34 +0000 (0:00:00.449) 0:00:49.801 ********* 2026-04-07 00:43:41.306753 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.306764 | orchestrator | 2026-04-07 00:43:41.306776 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.306787 | orchestrator | Tuesday 07 April 2026 00:43:34 +0000 (0:00:00.179) 0:00:49.981 ********* 2026-04-07 00:43:41.306798 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.306809 | orchestrator | 2026-04-07 00:43:41.306825 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.306843 | orchestrator | Tuesday 07 April 2026 00:43:34 +0000 (0:00:00.171) 0:00:50.153 ********* 2026-04-07 00:43:41.306862 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.306880 | orchestrator | 2026-04-07 00:43:41.306899 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.306916 | orchestrator | Tuesday 07 April 2026 00:43:34 +0000 (0:00:00.180) 0:00:50.334 ********* 2026-04-07 00:43:41.306935 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.306952 | orchestrator | 2026-04-07 00:43:41.306971 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.306989 | orchestrator | Tuesday 07 April 2026 00:43:35 +0000 (0:00:00.176) 0:00:50.510 ********* 2026-04-07 00:43:41.307006 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.307025 | orchestrator | 2026-04-07 00:43:41.307044 | orchestrator | TASK [Add known partitions to the list of available 
block devices] ************* 2026-04-07 00:43:41.307063 | orchestrator | Tuesday 07 April 2026 00:43:35 +0000 (0:00:00.174) 0:00:50.685 ********* 2026-04-07 00:43:41.307081 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2026-04-07 00:43:41.307101 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2026-04-07 00:43:41.307118 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2026-04-07 00:43:41.307136 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2026-04-07 00:43:41.307153 | orchestrator | 2026-04-07 00:43:41.307170 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.307189 | orchestrator | Tuesday 07 April 2026 00:43:35 +0000 (0:00:00.594) 0:00:51.280 ********* 2026-04-07 00:43:41.307208 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.307226 | orchestrator | 2026-04-07 00:43:41.307245 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.307265 | orchestrator | Tuesday 07 April 2026 00:43:35 +0000 (0:00:00.178) 0:00:51.458 ********* 2026-04-07 00:43:41.307310 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.307329 | orchestrator | 2026-04-07 00:43:41.307346 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.307363 | orchestrator | Tuesday 07 April 2026 00:43:36 +0000 (0:00:00.170) 0:00:51.629 ********* 2026-04-07 00:43:41.307381 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.307399 | orchestrator | 2026-04-07 00:43:41.307416 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-07 00:43:41.307467 | orchestrator | Tuesday 07 April 2026 00:43:36 +0000 (0:00:00.169) 0:00:51.799 ********* 2026-04-07 00:43:41.307486 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.307503 | orchestrator | 2026-04-07 00:43:41.307521 | orchestrator | TASK [Check 
whether ceph_db_wal_devices is used exclusively] ******************* 2026-04-07 00:43:41.307537 | orchestrator | Tuesday 07 April 2026 00:43:36 +0000 (0:00:00.173) 0:00:51.972 ********* 2026-04-07 00:43:41.307553 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.307570 | orchestrator | 2026-04-07 00:43:41.307587 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2026-04-07 00:43:41.307604 | orchestrator | Tuesday 07 April 2026 00:43:36 +0000 (0:00:00.117) 0:00:52.090 ********* 2026-04-07 00:43:41.307622 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0842dd12-8111-558f-8152-9e8987e1446c'}}) 2026-04-07 00:43:41.307639 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e59b5a6a-4894-5883-a5b3-f677d5bde0c7'}}) 2026-04-07 00:43:41.307655 | orchestrator | 2026-04-07 00:43:41.307672 | orchestrator | TASK [Create block VGs] ******************************************************** 2026-04-07 00:43:41.307689 | orchestrator | Tuesday 07 April 2026 00:43:36 +0000 (0:00:00.286) 0:00:52.376 ********* 2026-04-07 00:43:41.307708 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'}) 2026-04-07 00:43:41.307728 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'}) 2026-04-07 00:43:41.307744 | orchestrator | 2026-04-07 00:43:41.307762 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2026-04-07 00:43:41.307808 | orchestrator | Tuesday 07 April 2026 00:43:38 +0000 (0:00:01.864) 0:00:54.240 ********* 2026-04-07 00:43:41.307828 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 
'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:41.307848 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:41.307866 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.307885 | orchestrator | 2026-04-07 00:43:41.307904 | orchestrator | TASK [Create block LVs] ******************************************************** 2026-04-07 00:43:41.307922 | orchestrator | Tuesday 07 April 2026 00:43:38 +0000 (0:00:00.135) 0:00:54.375 ********* 2026-04-07 00:43:41.307940 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'}) 2026-04-07 00:43:41.307972 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'}) 2026-04-07 00:43:41.307991 | orchestrator | 2026-04-07 00:43:41.308009 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2026-04-07 00:43:41.308026 | orchestrator | Tuesday 07 April 2026 00:43:40 +0000 (0:00:01.317) 0:00:55.693 ********* 2026-04-07 00:43:41.308044 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:41.308063 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:41.308101 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.308121 | orchestrator | 2026-04-07 00:43:41.308140 | orchestrator | TASK [Create DB VGs] *********************************************************** 2026-04-07 00:43:41.308159 | orchestrator | Tuesday 07 April 2026 00:43:40 +0000 
(0:00:00.127) 0:00:55.821 ********* 2026-04-07 00:43:41.308177 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.308195 | orchestrator | 2026-04-07 00:43:41.308214 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2026-04-07 00:43:41.308232 | orchestrator | Tuesday 07 April 2026 00:43:40 +0000 (0:00:00.121) 0:00:55.943 ********* 2026-04-07 00:43:41.308251 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:41.308272 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:41.308290 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.308308 | orchestrator | 2026-04-07 00:43:41.308326 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2026-04-07 00:43:41.308343 | orchestrator | Tuesday 07 April 2026 00:43:40 +0000 (0:00:00.130) 0:00:56.073 ********* 2026-04-07 00:43:41.308361 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.308381 | orchestrator | 2026-04-07 00:43:41.308398 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2026-04-07 00:43:41.308418 | orchestrator | Tuesday 07 April 2026 00:43:40 +0000 (0:00:00.125) 0:00:56.199 ********* 2026-04-07 00:43:41.308467 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:41.308487 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:41.308507 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.308526 | 
orchestrator | 2026-04-07 00:43:41.308544 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2026-04-07 00:43:41.308556 | orchestrator | Tuesday 07 April 2026 00:43:40 +0000 (0:00:00.143) 0:00:56.343 ********* 2026-04-07 00:43:41.308567 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.308578 | orchestrator | 2026-04-07 00:43:41.308589 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2026-04-07 00:43:41.308600 | orchestrator | Tuesday 07 April 2026 00:43:41 +0000 (0:00:00.122) 0:00:56.465 ********* 2026-04-07 00:43:41.308611 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:41.308622 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:41.308634 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:41.308645 | orchestrator | 2026-04-07 00:43:41.308655 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2026-04-07 00:43:41.308666 | orchestrator | Tuesday 07 April 2026 00:43:41 +0000 (0:00:00.131) 0:00:56.596 ********* 2026-04-07 00:43:41.308678 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:43:41.308689 | orchestrator | 2026-04-07 00:43:41.308700 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2026-04-07 00:43:41.308711 | orchestrator | Tuesday 07 April 2026 00:43:41 +0000 (0:00:00.111) 0:00:56.707 ********* 2026-04-07 00:43:41.308737 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:46.846881 | orchestrator | skipping: [testbed-node-5] => 
(item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:46.847017 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.847034 | orchestrator | 2026-04-07 00:43:46.847047 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2026-04-07 00:43:46.847061 | orchestrator | Tuesday 07 April 2026 00:43:41 +0000 (0:00:00.254) 0:00:56.962 ********* 2026-04-07 00:43:46.847072 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:46.847084 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:46.847094 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.847105 | orchestrator | 2026-04-07 00:43:46.847116 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2026-04-07 00:43:46.847142 | orchestrator | Tuesday 07 April 2026 00:43:41 +0000 (0:00:00.136) 0:00:57.098 ********* 2026-04-07 00:43:46.847154 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:46.847165 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:46.847176 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.847187 | orchestrator | 2026-04-07 00:43:46.847198 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2026-04-07 00:43:46.847210 | orchestrator | Tuesday 07 April 2026 00:43:41 +0000 (0:00:00.129) 0:00:57.228 ********* 2026-04-07 
00:43:46.847221 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.847232 | orchestrator | 2026-04-07 00:43:46.847243 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2026-04-07 00:43:46.847254 | orchestrator | Tuesday 07 April 2026 00:43:41 +0000 (0:00:00.106) 0:00:57.334 ********* 2026-04-07 00:43:46.847264 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.847281 | orchestrator | 2026-04-07 00:43:46.847299 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2026-04-07 00:43:46.847318 | orchestrator | Tuesday 07 April 2026 00:43:41 +0000 (0:00:00.126) 0:00:57.461 ********* 2026-04-07 00:43:46.847336 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.847353 | orchestrator | 2026-04-07 00:43:46.847372 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2026-04-07 00:43:46.847391 | orchestrator | Tuesday 07 April 2026 00:43:42 +0000 (0:00:00.120) 0:00:57.582 ********* 2026-04-07 00:43:46.847408 | orchestrator | ok: [testbed-node-5] => { 2026-04-07 00:43:46.847499 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2026-04-07 00:43:46.847519 | orchestrator | } 2026-04-07 00:43:46.847538 | orchestrator | 2026-04-07 00:43:46.847556 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2026-04-07 00:43:46.847575 | orchestrator | Tuesday 07 April 2026 00:43:42 +0000 (0:00:00.126) 0:00:57.708 ********* 2026-04-07 00:43:46.847594 | orchestrator | ok: [testbed-node-5] => { 2026-04-07 00:43:46.847614 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2026-04-07 00:43:46.847633 | orchestrator | } 2026-04-07 00:43:46.847653 | orchestrator | 2026-04-07 00:43:46.847666 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2026-04-07 00:43:46.847680 | orchestrator | Tuesday 07 April 2026 00:43:42 +0000 (0:00:00.126) 
0:00:57.835 ********* 2026-04-07 00:43:46.847691 | orchestrator | ok: [testbed-node-5] => { 2026-04-07 00:43:46.847702 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2026-04-07 00:43:46.847713 | orchestrator | } 2026-04-07 00:43:46.847724 | orchestrator | 2026-04-07 00:43:46.847748 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2026-04-07 00:43:46.847759 | orchestrator | Tuesday 07 April 2026 00:43:42 +0000 (0:00:00.127) 0:00:57.962 ********* 2026-04-07 00:43:46.847782 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:43:46.847793 | orchestrator | 2026-04-07 00:43:46.847804 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2026-04-07 00:43:46.847815 | orchestrator | Tuesday 07 April 2026 00:43:43 +0000 (0:00:00.518) 0:00:58.481 ********* 2026-04-07 00:43:46.847825 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:43:46.847836 | orchestrator | 2026-04-07 00:43:46.847847 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2026-04-07 00:43:46.847858 | orchestrator | Tuesday 07 April 2026 00:43:43 +0000 (0:00:00.496) 0:00:58.978 ********* 2026-04-07 00:43:46.847869 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:43:46.847879 | orchestrator | 2026-04-07 00:43:46.847891 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2026-04-07 00:43:46.847901 | orchestrator | Tuesday 07 April 2026 00:43:44 +0000 (0:00:00.529) 0:00:59.507 ********* 2026-04-07 00:43:46.847912 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:43:46.847923 | orchestrator | 2026-04-07 00:43:46.847934 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2026-04-07 00:43:46.847945 | orchestrator | Tuesday 07 April 2026 00:43:44 +0000 (0:00:00.320) 0:00:59.828 ********* 2026-04-07 00:43:46.847957 | orchestrator | skipping: [testbed-node-5] 2026-04-07 
00:43:46.847967 | orchestrator | 2026-04-07 00:43:46.847978 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2026-04-07 00:43:46.847989 | orchestrator | Tuesday 07 April 2026 00:43:44 +0000 (0:00:00.107) 0:00:59.936 ********* 2026-04-07 00:43:46.848000 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848011 | orchestrator | 2026-04-07 00:43:46.848022 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2026-04-07 00:43:46.848032 | orchestrator | Tuesday 07 April 2026 00:43:44 +0000 (0:00:00.097) 0:01:00.033 ********* 2026-04-07 00:43:46.848044 | orchestrator | ok: [testbed-node-5] => { 2026-04-07 00:43:46.848055 | orchestrator |  "vgs_report": { 2026-04-07 00:43:46.848066 | orchestrator |  "vg": [] 2026-04-07 00:43:46.848097 | orchestrator |  } 2026-04-07 00:43:46.848109 | orchestrator | } 2026-04-07 00:43:46.848120 | orchestrator | 2026-04-07 00:43:46.848131 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2026-04-07 00:43:46.848142 | orchestrator | Tuesday 07 April 2026 00:43:44 +0000 (0:00:00.130) 0:01:00.164 ********* 2026-04-07 00:43:46.848153 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848164 | orchestrator | 2026-04-07 00:43:46.848175 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2026-04-07 00:43:46.848186 | orchestrator | Tuesday 07 April 2026 00:43:44 +0000 (0:00:00.121) 0:01:00.285 ********* 2026-04-07 00:43:46.848196 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848207 | orchestrator | 2026-04-07 00:43:46.848218 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2026-04-07 00:43:46.848229 | orchestrator | Tuesday 07 April 2026 00:43:44 +0000 (0:00:00.127) 0:01:00.412 ********* 2026-04-07 00:43:46.848240 | orchestrator | skipping: [testbed-node-5] 2026-04-07 
00:43:46.848251 | orchestrator | 2026-04-07 00:43:46.848262 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2026-04-07 00:43:46.848273 | orchestrator | Tuesday 07 April 2026 00:43:45 +0000 (0:00:00.115) 0:01:00.527 ********* 2026-04-07 00:43:46.848284 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848295 | orchestrator | 2026-04-07 00:43:46.848306 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2026-04-07 00:43:46.848317 | orchestrator | Tuesday 07 April 2026 00:43:45 +0000 (0:00:00.126) 0:01:00.654 ********* 2026-04-07 00:43:46.848328 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848338 | orchestrator | 2026-04-07 00:43:46.848349 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2026-04-07 00:43:46.848360 | orchestrator | Tuesday 07 April 2026 00:43:45 +0000 (0:00:00.130) 0:01:00.785 ********* 2026-04-07 00:43:46.848371 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848382 | orchestrator | 2026-04-07 00:43:46.848393 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2026-04-07 00:43:46.848411 | orchestrator | Tuesday 07 April 2026 00:43:45 +0000 (0:00:00.110) 0:01:00.896 ********* 2026-04-07 00:43:46.848438 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848449 | orchestrator | 2026-04-07 00:43:46.848460 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2026-04-07 00:43:46.848471 | orchestrator | Tuesday 07 April 2026 00:43:45 +0000 (0:00:00.133) 0:01:01.030 ********* 2026-04-07 00:43:46.848482 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848492 | orchestrator | 2026-04-07 00:43:46.848503 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2026-04-07 00:43:46.848514 | orchestrator | Tuesday 07 
April 2026 00:43:45 +0000 (0:00:00.118) 0:01:01.148 ********* 2026-04-07 00:43:46.848525 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848536 | orchestrator | 2026-04-07 00:43:46.848547 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2026-04-07 00:43:46.848558 | orchestrator | Tuesday 07 April 2026 00:43:45 +0000 (0:00:00.226) 0:01:01.375 ********* 2026-04-07 00:43:46.848569 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848580 | orchestrator | 2026-04-07 00:43:46.848591 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2026-04-07 00:43:46.848602 | orchestrator | Tuesday 07 April 2026 00:43:46 +0000 (0:00:00.124) 0:01:01.500 ********* 2026-04-07 00:43:46.848612 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848623 | orchestrator | 2026-04-07 00:43:46.848635 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2026-04-07 00:43:46.848645 | orchestrator | Tuesday 07 April 2026 00:43:46 +0000 (0:00:00.123) 0:01:01.623 ********* 2026-04-07 00:43:46.848656 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848667 | orchestrator | 2026-04-07 00:43:46.848678 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2026-04-07 00:43:46.848689 | orchestrator | Tuesday 07 April 2026 00:43:46 +0000 (0:00:00.120) 0:01:01.744 ********* 2026-04-07 00:43:46.848700 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848710 | orchestrator | 2026-04-07 00:43:46.848721 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2026-04-07 00:43:46.848732 | orchestrator | Tuesday 07 April 2026 00:43:46 +0000 (0:00:00.112) 0:01:01.856 ********* 2026-04-07 00:43:46.848743 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848754 | orchestrator | 2026-04-07 00:43:46.848765 | 
orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2026-04-07 00:43:46.848776 | orchestrator | Tuesday 07 April 2026 00:43:46 +0000 (0:00:00.111) 0:01:01.968 ********* 2026-04-07 00:43:46.848787 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:46.848798 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:46.848809 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848820 | orchestrator | 2026-04-07 00:43:46.848831 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2026-04-07 00:43:46.848842 | orchestrator | Tuesday 07 April 2026 00:43:46 +0000 (0:00:00.144) 0:01:02.113 ********* 2026-04-07 00:43:46.848853 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:46.848873 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:46.848904 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:46.848924 | orchestrator | 2026-04-07 00:43:46.848941 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2026-04-07 00:43:46.848959 | orchestrator | Tuesday 07 April 2026 00:43:46 +0000 (0:00:00.137) 0:01:02.250 ********* 2026-04-07 00:43:46.848999 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.531988 | orchestrator | skipping: [testbed-node-5] => 
(item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532089 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532106 | orchestrator | 2026-04-07 00:43:49.532118 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2026-04-07 00:43:49.532130 | orchestrator | Tuesday 07 April 2026 00:43:46 +0000 (0:00:00.135) 0:01:02.386 ********* 2026-04-07 00:43:49.532140 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.532168 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532180 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532191 | orchestrator | 2026-04-07 00:43:49.532202 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2026-04-07 00:43:49.532212 | orchestrator | Tuesday 07 April 2026 00:43:47 +0000 (0:00:00.143) 0:01:02.529 ********* 2026-04-07 00:43:49.532222 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.532231 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532240 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532250 | orchestrator | 2026-04-07 00:43:49.532262 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2026-04-07 00:43:49.532272 | orchestrator | Tuesday 07 April 2026 00:43:47 +0000 (0:00:00.137) 0:01:02.667 ********* 2026-04-07 
00:43:49.532281 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.532290 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532301 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532310 | orchestrator | 2026-04-07 00:43:49.532320 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2026-04-07 00:43:49.532342 | orchestrator | Tuesday 07 April 2026 00:43:47 +0000 (0:00:00.127) 0:01:02.795 ********* 2026-04-07 00:43:49.532353 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.532364 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532370 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532377 | orchestrator | 2026-04-07 00:43:49.532386 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2026-04-07 00:43:49.532396 | orchestrator | Tuesday 07 April 2026 00:43:47 +0000 (0:00:00.251) 0:01:03.046 ********* 2026-04-07 00:43:49.532408 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.532462 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532471 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532478 | orchestrator | 
2026-04-07 00:43:49.532484 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2026-04-07 00:43:49.532512 | orchestrator | Tuesday 07 April 2026 00:43:47 +0000 (0:00:00.139) 0:01:03.185 ********* 2026-04-07 00:43:49.532518 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:43:49.532527 | orchestrator | 2026-04-07 00:43:49.532535 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2026-04-07 00:43:49.532542 | orchestrator | Tuesday 07 April 2026 00:43:48 +0000 (0:00:00.488) 0:01:03.673 ********* 2026-04-07 00:43:49.532549 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:43:49.532557 | orchestrator | 2026-04-07 00:43:49.532564 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2026-04-07 00:43:49.532571 | orchestrator | Tuesday 07 April 2026 00:43:48 +0000 (0:00:00.516) 0:01:04.189 ********* 2026-04-07 00:43:49.532579 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:43:49.532585 | orchestrator | 2026-04-07 00:43:49.532593 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2026-04-07 00:43:49.532600 | orchestrator | Tuesday 07 April 2026 00:43:48 +0000 (0:00:00.120) 0:01:04.310 ********* 2026-04-07 00:43:49.532608 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'vg_name': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'}) 2026-04-07 00:43:49.532617 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'vg_name': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'}) 2026-04-07 00:43:49.532624 | orchestrator | 2026-04-07 00:43:49.532632 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2026-04-07 00:43:49.532639 | orchestrator | Tuesday 07 April 2026 00:43:48 +0000 (0:00:00.140) 0:01:04.450 ********* 2026-04-07 00:43:49.532662 | 
orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.532670 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532677 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532684 | orchestrator | 2026-04-07 00:43:49.532694 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2026-04-07 00:43:49.532706 | orchestrator | Tuesday 07 April 2026 00:43:49 +0000 (0:00:00.136) 0:01:04.587 ********* 2026-04-07 00:43:49.532717 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.532735 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532746 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532758 | orchestrator | 2026-04-07 00:43:49.532769 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2026-04-07 00:43:49.532780 | orchestrator | Tuesday 07 April 2026 00:43:49 +0000 (0:00:00.138) 0:01:04.725 ********* 2026-04-07 00:43:49.532790 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})  2026-04-07 00:43:49.532801 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})  2026-04-07 00:43:49.532808 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:43:49.532814 | orchestrator | 2026-04-07 
00:43:49.532820 | orchestrator | TASK [Print LVM report data] *************************************************** 2026-04-07 00:43:49.532826 | orchestrator | Tuesday 07 April 2026 00:43:49 +0000 (0:00:00.131) 0:01:04.857 ********* 2026-04-07 00:43:49.532832 | orchestrator | ok: [testbed-node-5] => { 2026-04-07 00:43:49.532839 | orchestrator |  "lvm_report": { 2026-04-07 00:43:49.532845 | orchestrator |  "lv": [ 2026-04-07 00:43:49.532851 | orchestrator |  { 2026-04-07 00:43:49.532858 | orchestrator |  "lv_name": "osd-block-0842dd12-8111-558f-8152-9e8987e1446c", 2026-04-07 00:43:49.532870 | orchestrator |  "vg_name": "ceph-0842dd12-8111-558f-8152-9e8987e1446c" 2026-04-07 00:43:49.532876 | orchestrator |  }, 2026-04-07 00:43:49.532882 | orchestrator |  { 2026-04-07 00:43:49.532889 | orchestrator |  "lv_name": "osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7", 2026-04-07 00:43:49.532895 | orchestrator |  "vg_name": "ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7" 2026-04-07 00:43:49.532901 | orchestrator |  } 2026-04-07 00:43:49.532907 | orchestrator |  ], 2026-04-07 00:43:49.532914 | orchestrator |  "pv": [ 2026-04-07 00:43:49.532920 | orchestrator |  { 2026-04-07 00:43:49.532926 | orchestrator |  "pv_name": "/dev/sdb", 2026-04-07 00:43:49.532932 | orchestrator |  "vg_name": "ceph-0842dd12-8111-558f-8152-9e8987e1446c" 2026-04-07 00:43:49.532939 | orchestrator |  }, 2026-04-07 00:43:49.532945 | orchestrator |  { 2026-04-07 00:43:49.532951 | orchestrator |  "pv_name": "/dev/sdc", 2026-04-07 00:43:49.532957 | orchestrator |  "vg_name": "ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7" 2026-04-07 00:43:49.532963 | orchestrator |  } 2026-04-07 00:43:49.532969 | orchestrator |  ] 2026-04-07 00:43:49.532975 | orchestrator |  } 2026-04-07 00:43:49.532982 | orchestrator | } 2026-04-07 00:43:49.532988 | orchestrator | 2026-04-07 00:43:49.532994 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:43:49.533000 | orchestrator | 
testbed-node-3 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2026-04-07 00:43:49.533007 | orchestrator | testbed-node-4 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2026-04-07 00:43:49.533013 | orchestrator | testbed-node-5 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2026-04-07 00:43:49.533019 | orchestrator | 2026-04-07 00:43:49.533025 | orchestrator | 2026-04-07 00:43:49.533031 | orchestrator | 2026-04-07 00:43:49.533037 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:43:49.533044 | orchestrator | Tuesday 07 April 2026 00:43:49 +0000 (0:00:00.123) 0:01:04.980 ********* 2026-04-07 00:43:49.533050 | orchestrator | =============================================================================== 2026-04-07 00:43:49.533056 | orchestrator | Create block VGs -------------------------------------------------------- 5.68s 2026-04-07 00:43:49.533062 | orchestrator | Create block LVs -------------------------------------------------------- 4.07s 2026-04-07 00:43:49.533068 | orchestrator | Gather DB VGs with total and available size in bytes -------------------- 1.78s 2026-04-07 00:43:49.533074 | orchestrator | Get list of Ceph PVs with associated VGs -------------------------------- 1.55s 2026-04-07 00:43:49.533080 | orchestrator | Gather DB+WAL VGs with total and available size in bytes ---------------- 1.55s 2026-04-07 00:43:49.533086 | orchestrator | Get list of Ceph LVs with associated VGs -------------------------------- 1.50s 2026-04-07 00:43:49.533092 | orchestrator | Gather WAL VGs with total and available size in bytes ------------------- 1.47s 2026-04-07 00:43:49.533098 | orchestrator | Add known partitions to the list of available block devices ------------- 1.36s 2026-04-07 00:43:49.533109 | orchestrator | Add known links to the list of available block devices ------------------ 1.09s 2026-04-07 00:43:49.796815 
| orchestrator | Add known partitions to the list of available block devices ------------- 0.94s 2026-04-07 00:43:49.796912 | orchestrator | Add known partitions to the list of available block devices ------------- 0.81s 2026-04-07 00:43:49.796928 | orchestrator | Print LVM report data --------------------------------------------------- 0.76s 2026-04-07 00:43:49.796940 | orchestrator | Add known links to the list of available block devices ------------------ 0.71s 2026-04-07 00:43:49.796952 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.67s 2026-04-07 00:43:49.796988 | orchestrator | Create dict of block VGs -> PVs from ceph_osd_devices ------------------- 0.66s 2026-04-07 00:43:49.796996 | orchestrator | Add known partitions to the list of available block devices ------------- 0.63s 2026-04-07 00:43:49.797006 | orchestrator | Get initial list of available block devices ----------------------------- 0.61s 2026-04-07 00:43:49.797032 | orchestrator | Create DB+WAL VGs ------------------------------------------------------- 0.61s 2026-04-07 00:43:49.797044 | orchestrator | Fail if DB LV defined in lvm_volumes is missing ------------------------- 0.60s 2026-04-07 00:43:49.797056 | orchestrator | Print 'Create DB VGs' --------------------------------------------------- 0.60s 2026-04-07 00:44:01.339301 | orchestrator | 2026-04-07 00:44:01 | INFO  | Prepare task for execution of facts. 2026-04-07 00:44:01.415935 | orchestrator | 2026-04-07 00:44:01 | INFO  | Task b816409f-dcc3-470b-895e-1b06158268bf (facts) was prepared for execution. 2026-04-07 00:44:01.416013 | orchestrator | 2026-04-07 00:44:01 | INFO  | It takes a moment until task b816409f-dcc3-470b-895e-1b06158268bf (facts) has been started and output is visible here. 
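The Ceph LVM tasks above ("Gather DB/WAL VGs with total and available size in bytes", "Combine JSON from _db/wal/db_wal_vgs_cmd_output") gather volume-group sizes via LVM's JSON report format and merge the results. A minimal sketch of parsing such output — assuming `vgs --reportformat json --units b -o vg_name,vg_size,vg_free` as in recent LVM2; the sample sizes and the `parse_vgs_report` helper are hypothetical, the VG name is taken from the log:

```python
import json

def parse_vgs_report(raw: str) -> dict:
    """Parse `vgs --reportformat json --units b` output into
    {vg_name: {"size": int, "free": int}} with sizes in bytes."""
    report = json.loads(raw)
    vgs = {}
    for section in report.get("report", []):
        for vg in section.get("vg", []):
            # LVM suffixes byte values with "B" when --units b is used
            vgs[vg["vg_name"]] = {
                "size": int(vg["vg_size"].rstrip("B")),
                "free": int(vg["vg_free"].rstrip("B")),
            }
    return vgs

# Hypothetical report matching the shape LVM emits:
sample = '''{"report":[{"vg":[
  {"vg_name":"ceph-0842dd12-8111-558f-8152-9e8987e1446c",
   "vg_size":"21470642176B","vg_free":"0B"}]}]}'''
print(parse_vgs_report(sample))
```

Merging the DB, WAL, and DB+WAL reports then reduces to combining these dicts before the size checks ("Fail if size of DB LVs ... > available") run.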
2026-04-07 00:44:13.072348 | orchestrator | 2026-04-07 00:44:13.072492 | orchestrator | PLAY [Apply role facts] ******************************************************** 2026-04-07 00:44:13.072510 | orchestrator | 2026-04-07 00:44:13.072520 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2026-04-07 00:44:13.072530 | orchestrator | Tuesday 07 April 2026 00:44:04 +0000 (0:00:00.352) 0:00:00.352 ********* 2026-04-07 00:44:13.072538 | orchestrator | ok: [testbed-manager] 2026-04-07 00:44:13.072548 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:44:13.072556 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:44:13.072564 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:44:13.072572 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:44:13.072581 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:44:13.072589 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:44:13.072597 | orchestrator | 2026-04-07 00:44:13.072605 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2026-04-07 00:44:13.072613 | orchestrator | Tuesday 07 April 2026 00:44:05 +0000 (0:00:01.315) 0:00:01.667 ********* 2026-04-07 00:44:13.072622 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:44:13.072631 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:44:13.072639 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:44:13.072647 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:44:13.072655 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:44:13.072663 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:44:13.072671 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:44:13.072679 | orchestrator | 2026-04-07 00:44:13.072688 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2026-04-07 00:44:13.072696 | orchestrator | 2026-04-07 00:44:13.072704 | orchestrator | TASK [Gathers facts about hosts] 
*********************************************** 2026-04-07 00:44:13.072712 | orchestrator | Tuesday 07 April 2026 00:44:07 +0000 (0:00:01.167) 0:00:02.835 ********* 2026-04-07 00:44:13.072720 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:44:13.072728 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:44:13.072736 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:44:13.072744 | orchestrator | ok: [testbed-manager] 2026-04-07 00:44:13.072752 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:44:13.072760 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:44:13.072768 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:44:13.072776 | orchestrator | 2026-04-07 00:44:13.072785 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2026-04-07 00:44:13.072793 | orchestrator | 2026-04-07 00:44:13.072801 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2026-04-07 00:44:13.072809 | orchestrator | Tuesday 07 April 2026 00:44:12 +0000 (0:00:05.067) 0:00:07.903 ********* 2026-04-07 00:44:13.072817 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:44:13.072825 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:44:13.072833 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:44:13.072863 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:44:13.072872 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:44:13.072880 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:44:13.072888 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:44:13.072898 | orchestrator | 2026-04-07 00:44:13.072908 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:44:13.072917 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:44:13.072928 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 
ignored=0 2026-04-07 00:44:13.072937 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:44:13.072947 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:44:13.072957 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:44:13.072966 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:44:13.072976 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:44:13.072986 | orchestrator | 2026-04-07 00:44:13.072995 | orchestrator | 2026-04-07 00:44:13.073005 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:44:13.073015 | orchestrator | Tuesday 07 April 2026 00:44:12 +0000 (0:00:00.523) 0:00:08.426 ********* 2026-04-07 00:44:13.073025 | orchestrator | =============================================================================== 2026-04-07 00:44:13.073037 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.07s 2026-04-07 00:44:13.073051 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.32s 2026-04-07 00:44:13.073082 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.17s 2026-04-07 00:44:13.073097 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.52s 2026-04-07 00:44:24.562229 | orchestrator | 2026-04-07 00:44:24 | INFO  | Prepare task for execution of frr. 2026-04-07 00:44:24.632721 | orchestrator | 2026-04-07 00:44:24 | INFO  | Task bef64962-92b9-49c4-a9ec-4c5593a48cd8 (frr) was prepared for execution. 
2026-04-07 00:44:24.632819 | orchestrator | 2026-04-07 00:44:24 | INFO  | It takes a moment until task bef64962-92b9-49c4-a9ec-4c5593a48cd8 (frr) has been started and output is visible here. 2026-04-07 00:44:47.511221 | orchestrator | 2026-04-07 00:44:47.511330 | orchestrator | PLAY [Apply role frr] ********************************************************** 2026-04-07 00:44:47.511349 | orchestrator | 2026-04-07 00:44:47.511362 | orchestrator | TASK [osism.services.frr : Include distribution specific install tasks] ******** 2026-04-07 00:44:47.511374 | orchestrator | Tuesday 07 April 2026 00:44:27 +0000 (0:00:00.265) 0:00:00.265 ********* 2026-04-07 00:44:47.511386 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/frr/tasks/install-Debian-family.yml for testbed-manager 2026-04-07 00:44:47.511432 | orchestrator | 2026-04-07 00:44:47.511444 | orchestrator | TASK [osism.services.frr : Pin frr package version] **************************** 2026-04-07 00:44:47.511456 | orchestrator | Tuesday 07 April 2026 00:44:27 +0000 (0:00:00.197) 0:00:00.462 ********* 2026-04-07 00:44:47.511467 | orchestrator | changed: [testbed-manager] 2026-04-07 00:44:47.511479 | orchestrator | 2026-04-07 00:44:47.511490 | orchestrator | TASK [osism.services.frr : Install frr package] ******************************** 2026-04-07 00:44:47.511501 | orchestrator | Tuesday 07 April 2026 00:44:29 +0000 (0:00:01.458) 0:00:01.921 ********* 2026-04-07 00:44:47.511533 | orchestrator | changed: [testbed-manager] 2026-04-07 00:44:47.511545 | orchestrator | 2026-04-07 00:44:47.511556 | orchestrator | TASK [osism.services.frr : Copy file: /etc/frr/vtysh.conf] ********************* 2026-04-07 00:44:47.511567 | orchestrator | Tuesday 07 April 2026 00:44:38 +0000 (0:00:09.268) 0:00:11.189 ********* 2026-04-07 00:44:47.511578 | orchestrator | ok: [testbed-manager] 2026-04-07 00:44:47.511590 | orchestrator | 2026-04-07 00:44:47.511601 | orchestrator | TASK 
[osism.services.frr : Copy file: /etc/frr/daemons] ************************ 2026-04-07 00:44:47.511613 | orchestrator | Tuesday 07 April 2026 00:44:39 +0000 (0:00:00.886) 0:00:12.076 ********* 2026-04-07 00:44:47.511624 | orchestrator | changed: [testbed-manager] 2026-04-07 00:44:47.511635 | orchestrator | 2026-04-07 00:44:47.511647 | orchestrator | TASK [osism.services.frr : Set _frr_uplinks fact] ****************************** 2026-04-07 00:44:47.511658 | orchestrator | Tuesday 07 April 2026 00:44:40 +0000 (0:00:00.842) 0:00:12.919 ********* 2026-04-07 00:44:47.511668 | orchestrator | ok: [testbed-manager] 2026-04-07 00:44:47.511680 | orchestrator | 2026-04-07 00:44:47.511691 | orchestrator | TASK [osism.services.frr : Write frr_config_template to temporary file] ******** 2026-04-07 00:44:47.511702 | orchestrator | Tuesday 07 April 2026 00:44:41 +0000 (0:00:01.071) 0:00:13.990 ********* 2026-04-07 00:44:47.511713 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:44:47.511724 | orchestrator | 2026-04-07 00:44:47.511735 | orchestrator | TASK [osism.services.frr : Render frr.conf from frr_config_template variable] *** 2026-04-07 00:44:47.511746 | orchestrator | Tuesday 07 April 2026 00:44:41 +0000 (0:00:00.126) 0:00:14.117 ********* 2026-04-07 00:44:47.511757 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:44:47.511768 | orchestrator | 2026-04-07 00:44:47.511779 | orchestrator | TASK [osism.services.frr : Remove temporary frr_config_template file] ********** 2026-04-07 00:44:47.511790 | orchestrator | Tuesday 07 April 2026 00:44:41 +0000 (0:00:00.216) 0:00:14.333 ********* 2026-04-07 00:44:47.511801 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:44:47.511812 | orchestrator | 2026-04-07 00:44:47.511823 | orchestrator | TASK [osism.services.frr : Check for frr.conf file in the configuration repository] *** 2026-04-07 00:44:47.511835 | orchestrator | Tuesday 07 April 2026 00:44:41 +0000 (0:00:00.144) 0:00:14.477 ********* 2026-04-07 
00:44:47.511846 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:44:47.511857 | orchestrator | 2026-04-07 00:44:47.511868 | orchestrator | TASK [osism.services.frr : Copy frr.conf file from the configuration repository] *** 2026-04-07 00:44:47.511879 | orchestrator | Tuesday 07 April 2026 00:44:41 +0000 (0:00:00.121) 0:00:14.599 ********* 2026-04-07 00:44:47.511890 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:44:47.511901 | orchestrator | 2026-04-07 00:44:47.511912 | orchestrator | TASK [osism.services.frr : Copy default frr.conf file of type k3s_cilium] ****** 2026-04-07 00:44:47.511923 | orchestrator | Tuesday 07 April 2026 00:44:42 +0000 (0:00:00.143) 0:00:14.742 ********* 2026-04-07 00:44:47.511934 | orchestrator | changed: [testbed-manager] 2026-04-07 00:44:47.511945 | orchestrator | 2026-04-07 00:44:47.511956 | orchestrator | TASK [osism.services.frr : Set sysctl parameters] ****************************** 2026-04-07 00:44:47.511967 | orchestrator | Tuesday 07 April 2026 00:44:42 +0000 (0:00:00.876) 0:00:15.619 ********* 2026-04-07 00:44:47.511978 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.ip_forward', 'value': 1}) 2026-04-07 00:44:47.511989 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.send_redirects', 'value': 0}) 2026-04-07 00:44:47.512001 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.accept_redirects', 'value': 0}) 2026-04-07 00:44:47.512012 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.fib_multipath_hash_policy', 'value': 1}) 2026-04-07 00:44:47.512023 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.default.ignore_routes_with_linkdown', 'value': 1}) 2026-04-07 00:44:47.512035 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.rp_filter', 'value': 2}) 2026-04-07 00:44:47.512046 | orchestrator | 2026-04-07 00:44:47.512057 | orchestrator | TASK 
[osism.services.frr : Manage frr service] ********************************* 2026-04-07 00:44:47.512075 | orchestrator | Tuesday 07 April 2026 00:44:44 +0000 (0:00:02.071) 0:00:17.690 ********* 2026-04-07 00:44:47.512086 | orchestrator | ok: [testbed-manager] 2026-04-07 00:44:47.512097 | orchestrator | 2026-04-07 00:44:47.512108 | orchestrator | RUNNING HANDLER [osism.services.frr : Restart frr service] ********************* 2026-04-07 00:44:47.512119 | orchestrator | Tuesday 07 April 2026 00:44:46 +0000 (0:00:01.124) 0:00:18.815 ********* 2026-04-07 00:44:47.512130 | orchestrator | changed: [testbed-manager] 2026-04-07 00:44:47.512141 | orchestrator | 2026-04-07 00:44:47.512152 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:44:47.512164 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2026-04-07 00:44:47.512175 | orchestrator | 2026-04-07 00:44:47.512186 | orchestrator | 2026-04-07 00:44:47.512214 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:44:47.512226 | orchestrator | Tuesday 07 April 2026 00:44:47 +0000 (0:00:01.188) 0:00:20.003 ********* 2026-04-07 00:44:47.512237 | orchestrator | =============================================================================== 2026-04-07 00:44:47.512248 | orchestrator | osism.services.frr : Install frr package -------------------------------- 9.27s 2026-04-07 00:44:47.512260 | orchestrator | osism.services.frr : Set sysctl parameters ------------------------------ 2.07s 2026-04-07 00:44:47.512270 | orchestrator | osism.services.frr : Pin frr package version ---------------------------- 1.46s 2026-04-07 00:44:47.512281 | orchestrator | osism.services.frr : Restart frr service -------------------------------- 1.19s 2026-04-07 00:44:47.512304 | orchestrator | osism.services.frr : Manage frr service --------------------------------- 1.12s 
2026-04-07 00:44:47.512315 | orchestrator | osism.services.frr : Set _frr_uplinks fact ------------------------------ 1.07s
2026-04-07 00:44:47.512326 | orchestrator | osism.services.frr : Copy file: /etc/frr/vtysh.conf --------------------- 0.89s
2026-04-07 00:44:47.512337 | orchestrator | osism.services.frr : Copy default frr.conf file of type k3s_cilium ------ 0.88s
2026-04-07 00:44:47.512347 | orchestrator | osism.services.frr : Copy file: /etc/frr/daemons ------------------------ 0.84s
2026-04-07 00:44:47.512358 | orchestrator | osism.services.frr : Render frr.conf from frr_config_template variable --- 0.22s
2026-04-07 00:44:47.512369 | orchestrator | osism.services.frr : Include distribution specific install tasks -------- 0.20s
2026-04-07 00:44:47.512380 | orchestrator | osism.services.frr : Remove temporary frr_config_template file ---------- 0.14s
2026-04-07 00:44:47.512410 | orchestrator | osism.services.frr : Copy frr.conf file from the configuration repository --- 0.14s
2026-04-07 00:44:47.512422 | orchestrator | osism.services.frr : Write frr_config_template to temporary file -------- 0.13s
2026-04-07 00:44:47.512433 | orchestrator | osism.services.frr : Check for frr.conf file in the configuration repository --- 0.12s
2026-04-07 00:44:47.632534 | orchestrator |
2026-04-07 00:44:47.635009 | orchestrator | --> DEPLOY IN A NUTSHELL -- START -- Tue Apr 7 00:44:47 UTC 2026
2026-04-07 00:44:47.635074 | orchestrator |
2026-04-07 00:44:48.661301 | orchestrator | 2026-04-07 00:44:48 | INFO  | Collection nutshell is prepared for execution
2026-04-07 00:44:48.764800 | orchestrator | 2026-04-07 00:44:48 | INFO  | A [0] - dotfiles
2026-04-07 00:44:58.810782 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [0] - homer
2026-04-07 00:44:58.810891 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [0] - netdata
2026-04-07 00:44:58.810906 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [0] - openstackclient
2026-04-07 00:44:58.810919 | orchestrator | 2026-04-07 00:44:58
| INFO  | A [0] - phpmyadmin
2026-04-07 00:44:58.811282 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [0] - common
2026-04-07 00:44:58.815924 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [1] -- loadbalancer
2026-04-07 00:44:58.815972 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [2] --- opensearch
2026-04-07 00:44:58.816268 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [2] --- mariadb-ng
2026-04-07 00:44:58.816678 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [3] ---- horizon
2026-04-07 00:44:58.816708 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [3] ---- keystone
2026-04-07 00:44:58.816967 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- neutron
2026-04-07 00:44:58.817234 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [5] ------ wait-for-nova
2026-04-07 00:44:58.817454 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [6] ------- octavia
2026-04-07 00:44:58.818829 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- barbican
2026-04-07 00:44:58.818955 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- designate
2026-04-07 00:44:58.818975 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- ironic
2026-04-07 00:44:58.819267 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- placement
2026-04-07 00:44:58.819288 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- magnum
2026-04-07 00:44:58.820876 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [1] -- openvswitch
2026-04-07 00:44:58.821256 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [2] --- ovn
2026-04-07 00:44:58.821275 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [1] -- memcached
2026-04-07 00:44:58.821458 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [1] -- redis
2026-04-07 00:44:58.821614 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [1] -- rabbitmq-ng
2026-04-07 00:44:58.821910 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [0] - kubernetes
2026-04-07 00:44:58.824847 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [1] -- kubeconfig
2026-04-07 00:44:58.824895 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [1] -- copy-kubeconfig
2026-04-07 00:44:58.824910 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [0] - ceph
2026-04-07 00:44:58.827605 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [1] -- ceph-pools
2026-04-07 00:44:58.827678 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [2] --- copy-ceph-keys
2026-04-07 00:44:58.827700 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [3] ---- cephclient
2026-04-07 00:44:58.827718 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- ceph-bootstrap-dashboard
2026-04-07 00:44:58.827736 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- wait-for-keystone
2026-04-07 00:44:58.827889 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [5] ------ kolla-ceph-rgw
2026-04-07 00:44:58.828209 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [5] ------ glance
2026-04-07 00:44:58.828247 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [5] ------ cinder
2026-04-07 00:44:58.828267 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [5] ------ nova
2026-04-07 00:44:58.828606 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [4] ----- prometheus
2026-04-07 00:44:58.828640 | orchestrator | 2026-04-07 00:44:58 | INFO  | A [5] ------ grafana
2026-04-07 00:44:59.047910 | orchestrator | 2026-04-07 00:44:59 | INFO  | All tasks of the collection nutshell are prepared for execution
2026-04-07 00:44:59.048153 | orchestrator | 2026-04-07 00:44:59 | INFO  | Tasks are running in the background
2026-04-07 00:45:00.926014 | orchestrator | 2026-04-07 00:45:00 | INFO  | No task IDs specified, wait for all currently running tasks
2026-04-07 00:45:03.143304 | orchestrator | 2026-04-07 00:45:03 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:45:03.143601 | orchestrator | 2026-04-07 00:45:03 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED
2026-04-07 00:45:03.144418 | orchestrator | 2026-04-07 00:45:03 | INFO
| Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED
2026-04-07 00:45:03.146336 | orchestrator | 2026-04-07 00:45:03 | INFO  | Task 479fd4b1-7d6b-433c-939b-72a4bc9420ce is in state STARTED
2026-04-07 00:45:03.148268 | orchestrator | 2026-04-07 00:45:03 | INFO  | Task 2d34cdf0-3565-456d-a096-56750fe08787 is in state STARTED
2026-04-07 00:45:03.148864 | orchestrator | 2026-04-07 00:45:03 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:45:03.152738 | orchestrator | 2026-04-07 00:45:03 | INFO  | Task 11115e85-8b0c-40e3-8ca1-7ef820053bed is in state STARTED
2026-04-07 00:45:03.152797 | orchestrator | 2026-04-07 00:45:03 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:45:18.516567 | orchestrator | 2026-04-07
00:45:18 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:45:21.865512 | orchestrator | 2026-04-07 00:45:21 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:45:21.865596 | orchestrator | 2026-04-07 00:45:21 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED
2026-04-07 00:45:21.865605 | orchestrator | 2026-04-07 00:45:21 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED
2026-04-07 00:45:21.865613 | orchestrator | 2026-04-07 00:45:21 | INFO  | Task 479fd4b1-7d6b-433c-939b-72a4bc9420ce is in state STARTED
2026-04-07 00:45:21.865620 | orchestrator | 2026-04-07 00:45:21 | INFO  | Task 2d34cdf0-3565-456d-a096-56750fe08787 is in state STARTED
2026-04-07 00:45:21.871643 | orchestrator |
2026-04-07 00:45:21.871741 | orchestrator | PLAY [Apply role geerlingguy.dotfiles] *****************************************
2026-04-07 00:45:21.871758 | orchestrator |
2026-04-07 00:45:21.871771 | orchestrator | TASK [geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally.] ****
2026-04-07 00:45:21.871783 | orchestrator | Tuesday 07 April 2026 00:45:08 +0000 (0:00:00.419) 0:00:00.419 *********
2026-04-07 00:45:21.871794 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:45:21.871807 | orchestrator | changed: [testbed-manager]
2026-04-07 00:45:21.871818 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:45:21.871829 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:45:21.871840 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:45:21.871851 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:45:21.871862 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:45:21.871901 | orchestrator |
2026-04-07 00:45:21.871922 | orchestrator | TASK [geerlingguy.dotfiles : Ensure all configured dotfiles are links.]
******** 2026-04-07 00:45:21.871934 | orchestrator | Tuesday 07 April 2026 00:45:13 +0000 (0:00:04.876) 0:00:05.295 ********* 2026-04-07 00:45:21.871945 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf) 2026-04-07 00:45:21.871957 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf) 2026-04-07 00:45:21.871968 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf) 2026-04-07 00:45:21.871979 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf) 2026-04-07 00:45:21.871990 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf) 2026-04-07 00:45:21.872000 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf) 2026-04-07 00:45:21.872011 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf) 2026-04-07 00:45:21.872022 | orchestrator | 2026-04-07 00:45:21.872033 | orchestrator | TASK [geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked.] *** 2026-04-07 00:45:21.872044 | orchestrator | Tuesday 07 April 2026 00:45:15 +0000 (0:00:01.889) 0:00:07.185 ********* 2026-04-07 00:45:21.872060 | orchestrator | ok: [testbed-manager] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-07 00:45:14.165675', 'end': '2026-04-07 00:45:14.172259', 'delta': '0:00:00.006584', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-07 00:45:21.872076 | orchestrator | ok: [testbed-node-1] => (item=[0, {'changed': False, 'stdout': 
'', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-07 00:45:14.338327', 'end': '2026-04-07 00:45:14.347876', 'delta': '0:00:00.009549', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-07 00:45:21.872088 | orchestrator | ok: [testbed-node-5] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-07 00:45:15.171173', 'end': '2026-04-07 00:45:15.176447', 'delta': '0:00:00.005274', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-07 00:45:21.872137 | orchestrator | ok: [testbed-node-2] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-07 00:45:14.219699', 'end': '2026-04-07 00:45:14.227895', 'delta': '0:00:00.008196', 'failed': False, 'msg': 'non-zero return 
code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-07 00:45:21.872173 | orchestrator | ok: [testbed-node-4] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-07 00:45:15.085302', 'end': '2026-04-07 00:45:15.093435', 'delta': '0:00:00.008133', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-07 00:45:21.872187 | orchestrator | ok: [testbed-node-0] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-07 00:45:14.214402', 'end': '2026-04-07 00:45:14.226084', 'delta': '0:00:00.011682', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 
'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-07 00:45:21.872201 | orchestrator | ok: [testbed-node-3] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-07 00:45:14.227954', 'end': '2026-04-07 00:45:14.234938', 'delta': '0:00:00.006984', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-07 00:45:21.872214 | orchestrator | 2026-04-07 00:45:21.872228 | orchestrator | TASK [geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist.] 
****
2026-04-07 00:45:21.872241 | orchestrator | Tuesday 07 April 2026 00:45:16 +0000 (0:00:01.376) 0:00:08.561 *********
2026-04-07 00:45:21.872254 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf)
2026-04-07 00:45:21.872268 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf)
2026-04-07 00:45:21.872280 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf)
2026-04-07 00:45:21.872293 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf)
2026-04-07 00:45:21.872306 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf)
2026-04-07 00:45:21.872318 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf)
2026-04-07 00:45:21.872331 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf)
2026-04-07 00:45:21.872343 | orchestrator |
2026-04-07 00:45:21.872355 | orchestrator | TASK [geerlingguy.dotfiles : Link dotfiles into home folder.] ******************
2026-04-07 00:45:21.872368 | orchestrator | Tuesday 07 April 2026 00:45:17 +0000 (0:00:01.116) 0:00:09.678 *********
2026-04-07 00:45:21.872425 | orchestrator | changed: [testbed-manager] => (item=.tmux.conf)
2026-04-07 00:45:21.872446 | orchestrator | changed: [testbed-node-2] => (item=.tmux.conf)
2026-04-07 00:45:21.872464 | orchestrator | changed: [testbed-node-0] => (item=.tmux.conf)
2026-04-07 00:45:21.872482 | orchestrator | changed: [testbed-node-3] => (item=.tmux.conf)
2026-04-07 00:45:21.872499 | orchestrator | changed: [testbed-node-1] => (item=.tmux.conf)
2026-04-07 00:45:21.872517 | orchestrator | changed: [testbed-node-4] => (item=.tmux.conf)
2026-04-07 00:45:21.872536 | orchestrator | changed: [testbed-node-5] => (item=.tmux.conf)
2026-04-07 00:45:21.872547 | orchestrator |
2026-04-07 00:45:21.872559 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:45:21.872580 | orchestrator | testbed-manager : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:45:21.872593 | orchestrator | testbed-node-0 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:45:21.872605 | orchestrator | testbed-node-1 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:45:21.872616 | orchestrator | testbed-node-2 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:45:21.872627 | orchestrator | testbed-node-3 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:45:21.872638 | orchestrator | testbed-node-4 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:45:21.872649 | orchestrator | testbed-node-5 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:45:21.872660 | orchestrator |
2026-04-07 00:45:21.872671 | orchestrator |
2026-04-07 00:45:21.872690 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:45:21.872707 | orchestrator | Tuesday 07 April 2026 00:45:20 +0000 (0:00:02.407) 0:00:12.085 *********
2026-04-07 00:45:21.872723 | orchestrator | ===============================================================================
2026-04-07 00:45:21.872739 | orchestrator | geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally. ---- 4.88s
2026-04-07 00:45:21.872755 | orchestrator | geerlingguy.dotfiles : Link dotfiles into home folder. ------------------ 2.41s
2026-04-07 00:45:21.872772 | orchestrator | geerlingguy.dotfiles : Ensure all configured dotfiles are links. -------- 1.89s
2026-04-07 00:45:21.872788 | orchestrator | geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked. --- 1.38s
2026-04-07 00:45:21.872805 | orchestrator | geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist.
---- 1.12s
2026-04-07 00:45:21.872820 | orchestrator | 2026-04-07 00:45:21 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:45:21.872836 | orchestrator | 2026-04-07 00:45:21 | INFO  | Task 11115e85-8b0c-40e3-8ca1-7ef820053bed is in state SUCCESS
2026-04-07 00:45:21.873171 | orchestrator | 2026-04-07 00:45:21 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:45:25.239932 | orchestrator | 2026-04-07 00:45:25 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:45:25.240532 | orchestrator | 2026-04-07 00:45:25 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED
2026-04-07 00:45:25.241734 | orchestrator | 2026-04-07 00:45:25 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED
2026-04-07 00:45:25.242932 | orchestrator | 2026-04-07 00:45:25 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED
2026-04-07 00:45:25.243808 | orchestrator | 2026-04-07 00:45:25 | INFO  | Task 479fd4b1-7d6b-433c-939b-72a4bc9420ce is in state STARTED
2026-04-07 00:45:25.247335 | orchestrator | 2026-04-07 00:45:25 | INFO  | Task 2d34cdf0-3565-456d-a096-56750fe08787 is in state STARTED
2026-04-07 00:45:25.247805 | orchestrator | 2026-04-07 00:45:25 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:45:25.247839 | orchestrator | 2026-04-07 00:45:25 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:45:50.085749 | orchestrator | 2026-04-07 00:45:50 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:45:50.086497 | orchestrator | 2026-04-07 00:45:50 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED
2026-04-07 00:45:50.087120 | orchestrator | 2026-04-07 00:45:50 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED
2026-04-07 00:45:50.088488 | orchestrator | 2026-04-07 00:45:50 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED
2026-04-07 00:45:50.088996 | orchestrator | 2026-04-07 00:45:50 | INFO  | Task 479fd4b1-7d6b-433c-939b-72a4bc9420ce is in state SUCCESS
2026-04-07 00:45:50.092902 | orchestrator | 2026-04-07 00:45:50 | INFO  | Task 2d34cdf0-3565-456d-a096-56750fe08787 is in state STARTED
2026-04-07 00:45:50.095274 | orchestrator | 2026-04-07 00:45:50 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:45:50.095320 | orchestrator | 2026-04-07 00:45:50 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:45:53.199000 | orchestrator | 2026-04-07 00:45:53 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:45:53.199145 | orchestrator | 2026-04-07 00:45:53 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:45:53.199175 | orchestrator | 2026-04-07 00:45:53 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:45:53.200609 | orchestrator | 2026-04-07 00:45:53 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED 2026-04-07 00:45:53.201068 | orchestrator | 2026-04-07 00:45:53 | INFO  | Task 2d34cdf0-3565-456d-a096-56750fe08787 is in state STARTED 2026-04-07 00:45:53.203553 | orchestrator | 2026-04-07 00:45:53 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:45:53.203604 | orchestrator | 2026-04-07 00:45:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:45:56.244872 | orchestrator | 2026-04-07 00:45:56 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:45:56.245223 | orchestrator | 2026-04-07 00:45:56 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:45:56.247863 | orchestrator | 2026-04-07 00:45:56 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:45:56.250280 | orchestrator | 2026-04-07 00:45:56 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED 2026-04-07 00:45:56.250994 | orchestrator | 2026-04-07 00:45:56 | INFO  | Task 2d34cdf0-3565-456d-a096-56750fe08787 is in state SUCCESS 2026-04-07 00:45:56.253475 | orchestrator | 2026-04-07 00:45:56 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:45:56.253548 | orchestrator | 2026-04-07 00:45:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:45:59.304333 | orchestrator | 2026-04-07 00:45:59 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 
2026-04-07 00:45:59.304518 | orchestrator | 2026-04-07 00:45:59 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:45:59.304535 | orchestrator | 2026-04-07 00:45:59 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:45:59.304575 | orchestrator | 2026-04-07 00:45:59 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED 2026-04-07 00:45:59.304586 | orchestrator | 2026-04-07 00:45:59 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:45:59.304596 | orchestrator | 2026-04-07 00:45:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:02.363806 | orchestrator | 2026-04-07 00:46:02 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:02.366539 | orchestrator | 2026-04-07 00:46:02 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:02.372652 | orchestrator | 2026-04-07 00:46:02 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:02.374944 | orchestrator | 2026-04-07 00:46:02 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED 2026-04-07 00:46:02.376745 | orchestrator | 2026-04-07 00:46:02 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:02.376800 | orchestrator | 2026-04-07 00:46:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:05.423540 | orchestrator | 2026-04-07 00:46:05 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:05.425533 | orchestrator | 2026-04-07 00:46:05 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:05.425565 | orchestrator | 2026-04-07 00:46:05 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:05.426473 | orchestrator | 2026-04-07 00:46:05 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED 
2026-04-07 00:46:05.428311 | orchestrator | 2026-04-07 00:46:05 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:05.428426 | orchestrator | 2026-04-07 00:46:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:08.478561 | orchestrator | 2026-04-07 00:46:08 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:08.480487 | orchestrator | 2026-04-07 00:46:08 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:08.480569 | orchestrator | 2026-04-07 00:46:08 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:08.480584 | orchestrator | 2026-04-07 00:46:08 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state STARTED 2026-04-07 00:46:08.481035 | orchestrator | 2026-04-07 00:46:08 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:08.481061 | orchestrator | 2026-04-07 00:46:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:11.516047 | orchestrator | 2026-04-07 00:46:11 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:11.518858 | orchestrator | 2026-04-07 00:46:11 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state STARTED 2026-04-07 00:46:11.520041 | orchestrator | 2026-04-07 00:46:11 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:11.520641 | orchestrator | 2026-04-07 00:46:11 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:11.521596 | orchestrator | 2026-04-07 00:46:11 | INFO  | Task b14f1c02-2d1e-4afe-a2e7-779d0deebfce is in state STARTED 2026-04-07 00:46:11.522568 | orchestrator | 2026-04-07 00:46:11 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:11.530795 | orchestrator | 2026-04-07 00:46:11.530898 | orchestrator | 2026-04-07 00:46:11.530921 | orchestrator | PLAY [Apply 
role homer] ******************************************************** 2026-04-07 00:46:11.530974 | orchestrator | 2026-04-07 00:46:11.530992 | orchestrator | TASK [osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards] *** 2026-04-07 00:46:11.531011 | orchestrator | Tuesday 07 April 2026 00:45:09 +0000 (0:00:00.888) 0:00:00.888 ********* 2026-04-07 00:46:11.531120 | orchestrator | ok: [testbed-manager] => { 2026-04-07 00:46:11.531142 | orchestrator |  "msg": "The support for the homer_url_kibana has been removed. Please use the homer_url_opensearch_dashboards parameter." 2026-04-07 00:46:11.531163 | orchestrator | } 2026-04-07 00:46:11.531231 | orchestrator | 2026-04-07 00:46:11.531250 | orchestrator | TASK [osism.services.homer : Create traefik external network] ****************** 2026-04-07 00:46:11.531268 | orchestrator | Tuesday 07 April 2026 00:45:10 +0000 (0:00:00.669) 0:00:01.557 ********* 2026-04-07 00:46:11.531286 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:11.531307 | orchestrator | 2026-04-07 00:46:11.531326 | orchestrator | TASK [osism.services.homer : Create required directories] ********************** 2026-04-07 00:46:11.531346 | orchestrator | Tuesday 07 April 2026 00:45:12 +0000 (0:00:02.802) 0:00:04.361 ********* 2026-04-07 00:46:11.531393 | orchestrator | changed: [testbed-manager] => (item=/opt/homer/configuration) 2026-04-07 00:46:11.531414 | orchestrator | ok: [testbed-manager] => (item=/opt/homer) 2026-04-07 00:46:11.531435 | orchestrator | 2026-04-07 00:46:11.531455 | orchestrator | TASK [osism.services.homer : Copy config.yml configuration file] *************** 2026-04-07 00:46:11.531475 | orchestrator | Tuesday 07 April 2026 00:45:15 +0000 (0:00:02.313) 0:00:06.674 ********* 2026-04-07 00:46:11.531495 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.531516 | orchestrator | 2026-04-07 00:46:11.531537 | orchestrator | TASK [osism.services.homer : Copy docker-compose.yml file] 
********************* 2026-04-07 00:46:11.531558 | orchestrator | Tuesday 07 April 2026 00:45:17 +0000 (0:00:02.083) 0:00:08.757 ********* 2026-04-07 00:46:11.531578 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.531597 | orchestrator | 2026-04-07 00:46:11.531615 | orchestrator | TASK [osism.services.homer : Manage homer service] ***************************** 2026-04-07 00:46:11.531633 | orchestrator | Tuesday 07 April 2026 00:45:18 +0000 (0:00:01.393) 0:00:10.150 ********* 2026-04-07 00:46:11.531650 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage homer service (10 retries left). 2026-04-07 00:46:11.531668 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:11.531687 | orchestrator | 2026-04-07 00:46:11.531707 | orchestrator | RUNNING HANDLER [osism.services.homer : Restart homer service] ***************** 2026-04-07 00:46:11.531726 | orchestrator | Tuesday 07 April 2026 00:45:46 +0000 (0:00:28.273) 0:00:38.424 ********* 2026-04-07 00:46:11.531744 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.531763 | orchestrator | 2026-04-07 00:46:11.531780 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:46:11.531799 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:11.531818 | orchestrator | 2026-04-07 00:46:11.531837 | orchestrator | 2026-04-07 00:46:11.531855 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:46:11.531873 | orchestrator | Tuesday 07 April 2026 00:45:48 +0000 (0:00:02.090) 0:00:40.514 ********* 2026-04-07 00:46:11.531892 | orchestrator | =============================================================================== 2026-04-07 00:46:11.531910 | orchestrator | osism.services.homer : Manage homer service ---------------------------- 28.27s 2026-04-07 00:46:11.531928 | orchestrator | osism.services.homer : Create 
traefik external network ------------------ 2.80s 2026-04-07 00:46:11.531946 | orchestrator | osism.services.homer : Create required directories ---------------------- 2.31s 2026-04-07 00:46:11.531966 | orchestrator | osism.services.homer : Restart homer service ---------------------------- 2.09s 2026-04-07 00:46:11.531985 | orchestrator | osism.services.homer : Copy config.yml configuration file --------------- 2.08s 2026-04-07 00:46:11.532005 | orchestrator | osism.services.homer : Copy docker-compose.yml file --------------------- 1.39s 2026-04-07 00:46:11.532024 | orchestrator | osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards --- 0.67s 2026-04-07 00:46:11.532062 | orchestrator | 2026-04-07 00:46:11.532082 | orchestrator | 2026-04-07 00:46:11.532101 | orchestrator | PLAY [Apply role openstackclient] ********************************************** 2026-04-07 00:46:11.532120 | orchestrator | 2026-04-07 00:46:11.532140 | orchestrator | TASK [osism.services.openstackclient : Include tasks] ************************** 2026-04-07 00:46:11.532160 | orchestrator | Tuesday 07 April 2026 00:45:08 +0000 (0:00:00.515) 0:00:00.515 ********* 2026-04-07 00:46:11.532179 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/openstackclient/tasks/container-Debian-family.yml for testbed-manager 2026-04-07 00:46:11.532199 | orchestrator | 2026-04-07 00:46:11.532218 | orchestrator | TASK [osism.services.openstackclient : Create required directories] ************ 2026-04-07 00:46:11.532236 | orchestrator | Tuesday 07 April 2026 00:45:09 +0000 (0:00:00.971) 0:00:01.487 ********* 2026-04-07 00:46:11.532255 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/openstack) 2026-04-07 00:46:11.532275 | orchestrator | changed: [testbed-manager] => (item=/opt/openstackclient/data) 2026-04-07 00:46:11.532295 | orchestrator | ok: [testbed-manager] => (item=/opt/openstackclient) 
2026-04-07 00:46:11.532314 | orchestrator | 2026-04-07 00:46:11.532333 | orchestrator | TASK [osism.services.openstackclient : Copy docker-compose.yml file] *********** 2026-04-07 00:46:11.532351 | orchestrator | Tuesday 07 April 2026 00:45:12 +0000 (0:00:03.322) 0:00:04.810 ********* 2026-04-07 00:46:11.532395 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.532414 | orchestrator | 2026-04-07 00:46:11.532432 | orchestrator | TASK [osism.services.openstackclient : Manage openstackclient service] ********* 2026-04-07 00:46:11.532450 | orchestrator | Tuesday 07 April 2026 00:45:14 +0000 (0:00:01.746) 0:00:06.557 ********* 2026-04-07 00:46:11.532492 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage openstackclient service (10 retries left). 2026-04-07 00:46:11.532513 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:11.532531 | orchestrator | 2026-04-07 00:46:11.532549 | orchestrator | TASK [osism.services.openstackclient : Copy openstack wrapper script] ********** 2026-04-07 00:46:11.532567 | orchestrator | Tuesday 07 April 2026 00:45:48 +0000 (0:00:33.322) 0:00:39.879 ********* 2026-04-07 00:46:11.532586 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.532605 | orchestrator | 2026-04-07 00:46:11.532637 | orchestrator | TASK [osism.services.openstackclient : Remove ospurge wrapper script] ********** 2026-04-07 00:46:11.532657 | orchestrator | Tuesday 07 April 2026 00:45:49 +0000 (0:00:01.565) 0:00:41.445 ********* 2026-04-07 00:46:11.532675 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:11.532692 | orchestrator | 2026-04-07 00:46:11.532796 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Restart openstackclient service] *** 2026-04-07 00:46:11.532816 | orchestrator | Tuesday 07 April 2026 00:45:50 +0000 (0:00:00.755) 0:00:42.200 ********* 2026-04-07 00:46:11.532835 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.532852 | orchestrator | 2026-04-07 00:46:11.532870 | orchestrator 
| RUNNING HANDLER [osism.services.openstackclient : Ensure that all containers are up] *** 2026-04-07 00:46:11.532887 | orchestrator | Tuesday 07 April 2026 00:45:52 +0000 (0:00:01.807) 0:00:44.007 ********* 2026-04-07 00:46:11.532905 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.532922 | orchestrator | 2026-04-07 00:46:11.532937 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Wait for an healthy service] *** 2026-04-07 00:46:11.532953 | orchestrator | Tuesday 07 April 2026 00:45:53 +0000 (0:00:00.906) 0:00:44.913 ********* 2026-04-07 00:46:11.532968 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.532983 | orchestrator | 2026-04-07 00:46:11.532998 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Copy bash completion script] *** 2026-04-07 00:46:11.533013 | orchestrator | Tuesday 07 April 2026 00:45:53 +0000 (0:00:00.487) 0:00:45.400 ********* 2026-04-07 00:46:11.533029 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:11.533045 | orchestrator | 2026-04-07 00:46:11.533061 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:46:11.533077 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:11.533111 | orchestrator | 2026-04-07 00:46:11.533127 | orchestrator | 2026-04-07 00:46:11.533143 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:46:11.533159 | orchestrator | Tuesday 07 April 2026 00:45:54 +0000 (0:00:00.590) 0:00:45.991 ********* 2026-04-07 00:46:11.533175 | orchestrator | =============================================================================== 2026-04-07 00:46:11.533190 | orchestrator | osism.services.openstackclient : Manage openstackclient service -------- 33.32s 2026-04-07 00:46:11.533206 | orchestrator | osism.services.openstackclient : Create required directories ------------ 3.32s 
2026-04-07 00:46:11.533223 | orchestrator | osism.services.openstackclient : Restart openstackclient service -------- 1.81s 2026-04-07 00:46:11.533239 | orchestrator | osism.services.openstackclient : Copy docker-compose.yml file ----------- 1.75s 2026-04-07 00:46:11.533320 | orchestrator | osism.services.openstackclient : Copy openstack wrapper script ---------- 1.57s 2026-04-07 00:46:11.533339 | orchestrator | osism.services.openstackclient : Include tasks -------------------------- 0.97s 2026-04-07 00:46:11.533358 | orchestrator | osism.services.openstackclient : Ensure that all containers are up ------ 0.91s 2026-04-07 00:46:11.533401 | orchestrator | osism.services.openstackclient : Remove ospurge wrapper script ---------- 0.76s 2026-04-07 00:46:11.533416 | orchestrator | osism.services.openstackclient : Copy bash completion script ------------ 0.59s 2026-04-07 00:46:11.533432 | orchestrator | osism.services.openstackclient : Wait for an healthy service ------------ 0.49s 2026-04-07 00:46:11.533448 | orchestrator | 2026-04-07 00:46:11.533466 | orchestrator | 2026-04-07 00:46:11 | INFO  | Task 4c29e711-a06d-432f-8faf-e10a6144a64d is in state SUCCESS 2026-04-07 00:46:11.537115 | orchestrator | 2026-04-07 00:46:11.537174 | orchestrator | PLAY [Apply role common] ******************************************************* 2026-04-07 00:46:11.537183 | orchestrator | 2026-04-07 00:46:11.537191 | orchestrator | TASK [common : include_tasks] ************************************************** 2026-04-07 00:46:11.537198 | orchestrator | Tuesday 07 April 2026 00:45:02 +0000 (0:00:00.337) 0:00:00.337 ********* 2026-04-07 00:46:11.537206 | orchestrator | included: /ansible/roles/common/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:46:11.537214 | orchestrator | 2026-04-07 00:46:11.537221 | orchestrator | TASK [common : Ensuring config directories exist] 
****************************** 2026-04-07 00:46:11.537228 | orchestrator | Tuesday 07 April 2026 00:45:03 +0000 (0:00:00.999) 0:00:01.337 ********* 2026-04-07 00:46:11.537235 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-07 00:46:11.537242 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-07 00:46:11.537249 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-07 00:46:11.537256 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-07 00:46:11.537263 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-07 00:46:11.537269 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-07 00:46:11.537276 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-07 00:46:11.537283 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-07 00:46:11.537289 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-07 00:46:11.537296 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-07 00:46:11.537304 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-07 00:46:11.537310 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-07 00:46:11.537317 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-07 00:46:11.537339 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-07 00:46:11.537346 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-07 00:46:11.537353 | orchestrator | 
changed: [testbed-node-2] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-07 00:46:11.537360 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-07 00:46:11.537391 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-07 00:46:11.537398 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-07 00:46:11.537405 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-07 00:46:11.537413 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-07 00:46:11.537419 | orchestrator | 2026-04-07 00:46:11.537426 | orchestrator | TASK [common : include_tasks] ************************************************** 2026-04-07 00:46:11.537433 | orchestrator | Tuesday 07 April 2026 00:45:07 +0000 (0:00:03.382) 0:00:04.720 ********* 2026-04-07 00:46:11.537440 | orchestrator | included: /ansible/roles/common/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:46:11.537448 | orchestrator | 2026-04-07 00:46:11.537455 | orchestrator | TASK [service-cert-copy : common | Copying over extra CA certificates] ********* 2026-04-07 00:46:11.537461 | orchestrator | Tuesday 07 April 2026 00:45:08 +0000 (0:00:01.249) 0:00:05.969 ********* 2026-04-07 00:46:11.537498 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.537508 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.537527 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.537535 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.537542 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.537561 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.537569 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.537577 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.537584 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.537597 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.537604 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537617 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537632 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537647 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537654 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537661 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537673 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537680 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537688 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537701 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537712 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537721 | orchestrator |
2026-04-07 00:46:11.537729 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS certificate] ***
2026-04-07 00:46:11.537737 | orchestrator | Tuesday 07 April 2026 00:45:12 +0000 (0:00:04.407) 0:00:10.376 *********
2026-04-07 00:46:11.537746 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.537754 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537763 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.537779 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537789 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.537801 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537809 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:46:11.537821 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.537830 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537838 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537847 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537859 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.537867 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537880 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:46:11.537888 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:46:11.537897 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537906 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:46:11.537914 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.537925 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537934 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537942 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:46:11.537950 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537959 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.537977 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.537991 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:46:11.537999 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538008 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538060 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:46:11.538067 | orchestrator |
2026-04-07 00:46:11.538074 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS key] ******
2026-04-07 00:46:11.538082 | orchestrator | Tuesday 07 April 2026 00:45:17 +0000 (0:00:04.607) 0:00:14.984 *********
2026-04-07 00:46:11.538089 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538096 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538104 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538111 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538129 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538136 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538147 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538158 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538166 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:46:11.538173 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538180 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:46:11.538188 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538195 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538206 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538214 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:46:11.538225 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538233 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538240 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538247 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:46:11.538260 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538267 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538275 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:46:11.538282 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538294 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538301 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:46:11.538321 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538329 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538336 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:46:11.538343 | orchestrator |
2026-04-07 00:46:11.538350 | orchestrator | TASK [common : Ensure /var/log/journal exists on EL10 systems] *****************
2026-04-07 00:46:11.538357 | orchestrator | Tuesday 07 April 2026 00:45:21 +0000 (0:00:04.372) 0:00:19.356 *********
2026-04-07 00:46:11.538410 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:46:11.538419 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:46:11.538426 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:46:11.538432 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:46:11.538439 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:46:11.538446 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:46:11.538453 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:46:11.538460 | orchestrator |
2026-04-07 00:46:11.538467 | orchestrator | TASK [common : Copying over /run subdirectories conf] **************************
2026-04-07 00:46:11.538473 | orchestrator | Tuesday 07 April 2026 00:45:22 +0000 (0:00:01.036) 0:00:20.393 *********
2026-04-07 00:46:11.538480 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:46:11.538487 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:46:11.538494 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:46:11.538500 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:46:11.538507 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:46:11.538514 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:46:11.538521 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:46:11.538528 | orchestrator |
2026-04-07 00:46:11.538535 | orchestrator | TASK [common : Restart systemd-tmpfiles] ***************************************
2026-04-07 00:46:11.538546 | orchestrator | Tuesday 07 April 2026 00:45:23 +0000 (0:00:00.844) 0:00:21.237 *********
2026-04-07 00:46:11.538553 | orchestrator | skipping: [testbed-manager]
2026-04-07 00:46:11.538560 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:46:11.538567 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:46:11.538573 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:46:11.538580 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:46:11.538587 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:46:11.538594 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:46:11.538601 | orchestrator |
2026-04-07 00:46:11.538607 | orchestrator | TASK [common : Copying over kolla.target] **************************************
2026-04-07 00:46:11.538620 | orchestrator | Tuesday 07 April 2026 00:45:25 +0000 (0:00:02.074) 0:00:22.987 *********
2026-04-07 00:46:11.538627 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:46:11.538633 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:46:11.538640 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:46:11.538647 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:46:11.538654 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:46:11.538660 | orchestrator | changed: [testbed-manager]
2026-04-07 00:46:11.538667 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:46:11.538674 | orchestrator |
2026-04-07 00:46:11.538681 | orchestrator | TASK [common : Copying over config.json files for services] ********************
2026-04-07 00:46:11.538687 | orchestrator | Tuesday 07 April 2026 00:45:27 +0000 (0:00:02.074) 0:00:25.062 *********
2026-04-07 00:46:11.538695 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538702 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538722 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538730 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538738 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538745 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538760 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538767 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 00:46:11.538775 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-07 00:46:11.538782 | orchestrator | changed: [testbed-manager] => (item={'key': 
'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538801 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538809 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538820 | orchestrator | changed: [testbed-node-5] => (item={'key': 
'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538832 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538839 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538847 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538854 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538866 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538874 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538881 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538888 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.538900 | orchestrator | 2026-04-07 00:46:11.538907 | orchestrator | TASK [common : Find custom fluentd input config files] ************************* 2026-04-07 00:46:11.538914 | orchestrator | Tuesday 07 April 2026 00:45:33 +0000 (0:00:05.502) 0:00:30.565 ********* 2026-04-07 00:46:11.538921 | orchestrator | [WARNING]: Skipped 2026-04-07 00:46:11.538931 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' path due 2026-04-07 00:46:11.538939 | orchestrator | to this access issue: 2026-04-07 00:46:11.538946 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' is not a 2026-04-07 00:46:11.538953 | orchestrator | directory 2026-04-07 00:46:11.538960 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-07 00:46:11.538967 | orchestrator | 2026-04-07 00:46:11.538974 | orchestrator | TASK [common : Find custom fluentd filter config files] ************************ 2026-04-07 00:46:11.538981 | orchestrator | Tuesday 07 April 2026 00:45:34 +0000 (0:00:01.007) 0:00:31.572 ********* 2026-04-07 00:46:11.538987 | orchestrator | [WARNING]: Skipped 2026-04-07 00:46:11.538994 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' path due 2026-04-07 00:46:11.539001 | orchestrator | to this access issue: 2026-04-07 00:46:11.539008 | orchestrator | 
'/opt/configuration/environments/kolla/files/overlays/fluentd/filter' is not a 2026-04-07 00:46:11.539015 | orchestrator | directory 2026-04-07 00:46:11.539022 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-07 00:46:11.539029 | orchestrator | 2026-04-07 00:46:11.539035 | orchestrator | TASK [common : Find custom fluentd format config files] ************************ 2026-04-07 00:46:11.539042 | orchestrator | Tuesday 07 April 2026 00:45:35 +0000 (0:00:01.013) 0:00:32.586 ********* 2026-04-07 00:46:11.539049 | orchestrator | [WARNING]: Skipped 2026-04-07 00:46:11.539056 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' path due 2026-04-07 00:46:11.539063 | orchestrator | to this access issue: 2026-04-07 00:46:11.539069 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' is not a 2026-04-07 00:46:11.539076 | orchestrator | directory 2026-04-07 00:46:11.539083 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-07 00:46:11.539090 | orchestrator | 2026-04-07 00:46:11.539097 | orchestrator | TASK [common : Find custom fluentd output config files] ************************ 2026-04-07 00:46:11.539103 | orchestrator | Tuesday 07 April 2026 00:45:36 +0000 (0:00:01.068) 0:00:33.654 ********* 2026-04-07 00:46:11.539110 | orchestrator | [WARNING]: Skipped 2026-04-07 00:46:11.539117 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' path due 2026-04-07 00:46:11.539124 | orchestrator | to this access issue: 2026-04-07 00:46:11.539131 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' is not a 2026-04-07 00:46:11.539137 | orchestrator | directory 2026-04-07 00:46:11.539144 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-07 00:46:11.539151 | orchestrator | 2026-04-07 00:46:11.539158 | orchestrator | TASK [common : Copying over fluentd.conf] ************************************** 2026-04-07 00:46:11.539165 | 
orchestrator | Tuesday 07 April 2026 00:45:36 +0000 (0:00:00.842) 0:00:34.496 ********* 2026-04-07 00:46:11.539172 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:11.539179 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.539186 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:46:11.539192 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:11.539199 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:11.539206 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:11.539213 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:46:11.539219 | orchestrator | 2026-04-07 00:46:11.539226 | orchestrator | TASK [common : Copying over cron logrotate config file] ************************ 2026-04-07 00:46:11.539238 | orchestrator | Tuesday 07 April 2026 00:45:41 +0000 (0:00:04.714) 0:00:39.211 ********* 2026-04-07 00:46:11.539244 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-07 00:46:11.539255 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-07 00:46:11.539263 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-07 00:46:11.539270 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-07 00:46:11.539277 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-07 00:46:11.539283 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-07 00:46:11.539290 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-07 00:46:11.539297 | orchestrator | 2026-04-07 00:46:11.539304 | orchestrator | TASK [common : Ensure RabbitMQ Erlang cookie 
exists] *************************** 2026-04-07 00:46:11.539311 | orchestrator | Tuesday 07 April 2026 00:45:44 +0000 (0:00:02.646) 0:00:41.858 ********* 2026-04-07 00:46:11.539317 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:11.539324 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:11.539331 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:11.539338 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.539345 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:46:11.539352 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:11.539358 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:46:11.539376 | orchestrator | 2026-04-07 00:46:11.539383 | orchestrator | TASK [common : Ensuring config directories have correct owner and permission] *** 2026-04-07 00:46:11.539390 | orchestrator | Tuesday 07 April 2026 00:45:47 +0000 (0:00:03.090) 0:00:44.948 ********* 2026-04-07 00:46:11.539401 | orchestrator | ok: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539409 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.539417 | orchestrator | ok: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539424 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.539438 | orchestrator | ok: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 
'dimensions': {}}}) 2026-04-07 00:46:11.539451 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.539458 | orchestrator | ok: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539466 | orchestrator | ok: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539478 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': 
{'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.539486 | orchestrator | ok: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539494 | orchestrator | ok: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539505 | orchestrator | ok: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539516 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': 
{'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.539524 | orchestrator | ok: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539532 | orchestrator | ok: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539539 | orchestrator | ok: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', 
'/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539552 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.539560 | orchestrator | ok: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539572 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.539579 | orchestrator | ok: 
[testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539596 | orchestrator | ok: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539603 | orchestrator | 2026-04-07 00:46:11.539610 | orchestrator | TASK [common : Copy rabbitmq-env.conf to kolla toolbox] ************************ 2026-04-07 00:46:11.539617 | orchestrator | Tuesday 07 April 2026 00:45:51 +0000 (0:00:03.675) 0:00:48.624 ********* 2026-04-07 00:46:11.539624 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-07 00:46:11.539631 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-07 00:46:11.539638 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-07 00:46:11.539645 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-07 00:46:11.539652 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-07 00:46:11.539659 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-07 00:46:11.539665 | 
orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-07 00:46:11.539672 | orchestrator | 2026-04-07 00:46:11.539679 | orchestrator | TASK [common : Copy rabbitmq erl_inetrc to kolla toolbox] ********************** 2026-04-07 00:46:11.539686 | orchestrator | Tuesday 07 April 2026 00:45:53 +0000 (0:00:02.545) 0:00:51.170 ********* 2026-04-07 00:46:11.539693 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-07 00:46:11.539700 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-07 00:46:11.539707 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-07 00:46:11.539714 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-07 00:46:11.539721 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-07 00:46:11.539728 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-07 00:46:11.539738 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-07 00:46:11.539745 | orchestrator | 2026-04-07 00:46:11.539752 | orchestrator | TASK [service-check-containers : common | Check containers] ******************** 2026-04-07 00:46:11.539759 | orchestrator | Tuesday 07 April 2026 00:45:56 +0000 (0:00:02.644) 0:00:53.814 ********* 2026-04-07 00:46:11.539771 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539778 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539786 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539793 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539804 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539812 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539819 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-07 00:46:11.539830 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539843 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539850 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539857 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539869 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539876 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539887 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539899 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539906 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539913 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539921 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539928 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539939 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539947 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:46:11.539954 | orchestrator | 2026-04-07 00:46:11.539961 | orchestrator | TASK [service-check-containers : common | Notify handlers to restart containers] *** 2026-04-07 00:46:11.539968 | orchestrator | Tuesday 07 April 2026 00:45:59 +0000 (0:00:03.091) 0:00:56.905 ********* 2026-04-07 00:46:11.539975 | orchestrator | changed: [testbed-manager] => { 2026-04-07 00:46:11.539982 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:11.539989 | 
orchestrator | } 2026-04-07 00:46:11.539996 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:46:11.540007 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:11.540014 | orchestrator | } 2026-04-07 00:46:11.540021 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 00:46:11.540027 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:11.540034 | orchestrator | } 2026-04-07 00:46:11.540041 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:46:11.540048 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:11.540056 | orchestrator | } 2026-04-07 00:46:11.540063 | orchestrator | changed: [testbed-node-3] => { 2026-04-07 00:46:11.540070 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:11.540077 | orchestrator | } 2026-04-07 00:46:11.540083 | orchestrator | changed: [testbed-node-4] => { 2026-04-07 00:46:11.540090 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:11.540097 | orchestrator | } 2026-04-07 00:46:11.540104 | orchestrator | changed: [testbed-node-5] => { 2026-04-07 00:46:11.540111 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:11.540118 | orchestrator | } 2026-04-07 00:46:11.540124 | orchestrator | 2026-04-07 00:46:11.540135 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:46:11.540142 | orchestrator | Tuesday 07 April 2026 00:46:00 +0000 (0:00:00.775) 0:00:57.681 ********* 2026-04-07 00:46:11.540149 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', 
'/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-07 00:46:11.540157 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540164 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540172 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-07 00:46:11.540183 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540195 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540203 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-07 00:46:11.540214 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540222 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:46:11.540229 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540236 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-07 00:46:11.540244 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  
2026-04-07 00:46:11.540255 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540262 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:46:11.540273 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:46:11.540281 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-07 00:46:11.540288 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540299 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540306 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:46:11.540313 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:46:11.540320 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-07 00:46:11.540327 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540335 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': 
['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540342 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:46:11.540349 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-07 00:46:11.540378 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540386 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:46:11.540393 | orchestrator | 
skipping: [testbed-node-5] 2026-04-07 00:46:11.540400 | orchestrator | 2026-04-07 00:46:11.540407 | orchestrator | TASK [common : Creating log volume] ******************************************** 2026-04-07 00:46:11.540414 | orchestrator | Tuesday 07 April 2026 00:46:02 +0000 (0:00:02.089) 0:00:59.770 ********* 2026-04-07 00:46:11.540421 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.540428 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:11.540435 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:11.540442 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:11.540448 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:46:11.540455 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:11.540462 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:46:11.540469 | orchestrator | 2026-04-07 00:46:11.540476 | orchestrator | TASK [common : Link kolla_logs volume to /var/log/kolla] *********************** 2026-04-07 00:46:11.540482 | orchestrator | Tuesday 07 April 2026 00:46:04 +0000 (0:00:02.050) 0:01:01.820 ********* 2026-04-07 00:46:11.540489 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:11.540496 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:11.540503 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:11.540510 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:11.540517 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:46:11.540523 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:11.540530 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:46:11.540537 | orchestrator | 2026-04-07 00:46:11.540544 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-07 00:46:11.540551 | orchestrator | Tuesday 07 April 2026 00:46:05 +0000 (0:00:01.365) 0:01:03.186 ********* 2026-04-07 00:46:11.540557 | orchestrator | 2026-04-07 00:46:11.540564 | orchestrator | TASK [common : Flush handlers] 
************************************************* 2026-04-07 00:46:11.540571 | orchestrator | Tuesday 07 April 2026 00:46:05 +0000 (0:00:00.089) 0:01:03.276 ********* 2026-04-07 00:46:11.540578 | orchestrator | 2026-04-07 00:46:11.540585 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-07 00:46:11.540591 | orchestrator | Tuesday 07 April 2026 00:46:05 +0000 (0:00:00.065) 0:01:03.341 ********* 2026-04-07 00:46:11.540598 | orchestrator | 2026-04-07 00:46:11.540605 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-07 00:46:11.540612 | orchestrator | Tuesday 07 April 2026 00:46:05 +0000 (0:00:00.064) 0:01:03.406 ********* 2026-04-07 00:46:11.540619 | orchestrator | 2026-04-07 00:46:11.540625 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-07 00:46:11.540632 | orchestrator | Tuesday 07 April 2026 00:46:05 +0000 (0:00:00.064) 0:01:03.470 ********* 2026-04-07 00:46:11.540639 | orchestrator | 2026-04-07 00:46:11.540652 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-07 00:46:11.540664 | orchestrator | Tuesday 07 April 2026 00:46:06 +0000 (0:00:00.066) 0:01:03.536 ********* 2026-04-07 00:46:11.540671 | orchestrator | 2026-04-07 00:46:11.540678 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-07 00:46:11.540685 | orchestrator | Tuesday 07 April 2026 00:46:06 +0000 (0:00:00.061) 0:01:03.598 ********* 2026-04-07 00:46:11.540692 | orchestrator | 2026-04-07 00:46:11.540698 | orchestrator | RUNNING HANDLER [common : Restart fluentd container] *************************** 2026-04-07 00:46:11.540705 | orchestrator | Tuesday 07 April 2026 00:46:06 +0000 (0:00:00.090) 0:01:03.688 ********* 2026-04-07 00:46:11.540720 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_1fvijnkl/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_1fvijnkl/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_1fvijnkl/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_1fvijnkl/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, 
response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:11.540733 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_s5nw3boz/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_s5nw3boz/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_s5nw3boz/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_s5nw3boz/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n 
self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:11 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:11 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:11.540757 | orchestrator | fatal: [testbed-manager]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_v4c96xuh/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n 
File \"/tmp/ansible_kolla_container_payload_v4c96xuh/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_v4c96xuh/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_v4c96xuh/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:11.540803 | orchestrator | fatal: [testbed-node-3]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_pdnuyuyu/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_pdnuyuyu/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_pdnuyuyu/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_pdnuyuyu/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, 
response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:11.540816 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_hrrj50fw/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_hrrj50fw/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_hrrj50fw/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_hrrj50fw/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n 
self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:11.540838 | orchestrator | fatal: [testbed-node-4]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_7r4hgflg/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_7r4hgflg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_7r4hgflg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n 
self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_7r4hgflg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:11.540851 | orchestrator | fatal: [testbed-node-5]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_emjdof4t/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_emjdof4t/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_emjdof4t/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_emjdof4t/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, 
response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Ffluentd: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:11.540864 | orchestrator | 2026-04-07 00:46:11.540871 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:46:11.540878 | orchestrator | testbed-manager : ok=20  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-07 00:46:11.540886 | orchestrator | testbed-node-0 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-07 00:46:11.540893 | orchestrator | testbed-node-1 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-07 00:46:11.540900 | orchestrator | testbed-node-2 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-07 00:46:11.540906 | orchestrator | testbed-node-3 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-07 00:46:11.540913 | orchestrator | testbed-node-4 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-07 00:46:11.540920 | orchestrator | testbed-node-5 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-07 00:46:11.540927 | orchestrator | 2026-04-07 00:46:11.540934 | orchestrator | 2026-04-07 00:46:11.540941 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:46:11.540948 | orchestrator | Tuesday 07 April 2026 00:46:09 +0000 (0:00:03.289) 0:01:06.978 ********* 2026-04-07 00:46:11.540955 | orchestrator | =============================================================================== 2026-04-07 00:46:11.540962 | orchestrator | common : Copying over config.json files for services -------------------- 5.50s 2026-04-07 00:46:11.540972 | 
orchestrator | common : Copying over fluentd.conf -------------------------------------- 4.71s 2026-04-07 00:46:11.540983 | orchestrator | service-cert-copy : common | Copying over backend internal TLS certificate --- 4.61s 2026-04-07 00:46:11.540990 | orchestrator | service-cert-copy : common | Copying over extra CA certificates --------- 4.41s 2026-04-07 00:46:11.540997 | orchestrator | service-cert-copy : common | Copying over backend internal TLS key ------ 4.37s 2026-04-07 00:46:11.541004 | orchestrator | common : Ensuring config directories have correct owner and permission --- 3.68s 2026-04-07 00:46:11.541011 | orchestrator | common : Ensuring config directories exist ------------------------------ 3.38s 2026-04-07 00:46:11.541018 | orchestrator | common : Restart fluentd container -------------------------------------- 3.29s 2026-04-07 00:46:11.541024 | orchestrator | service-check-containers : common | Check containers -------------------- 3.09s 2026-04-07 00:46:11.541031 | orchestrator | common : Ensure RabbitMQ Erlang cookie exists --------------------------- 3.09s 2026-04-07 00:46:11.541038 | orchestrator | common : Copying over cron logrotate config file ------------------------ 2.65s 2026-04-07 00:46:11.541045 | orchestrator | common : Copy rabbitmq erl_inetrc to kolla toolbox ---------------------- 2.64s 2026-04-07 00:46:11.541051 | orchestrator | common : Copy rabbitmq-env.conf to kolla toolbox ------------------------ 2.54s 2026-04-07 00:46:11.541058 | orchestrator | service-check-containers : Include tasks -------------------------------- 2.09s 2026-04-07 00:46:11.541065 | orchestrator | common : Copying over kolla.target -------------------------------------- 2.08s 2026-04-07 00:46:11.541072 | orchestrator | common : Creating log volume -------------------------------------------- 2.05s 2026-04-07 00:46:11.541079 | orchestrator | common : Restart systemd-tmpfiles --------------------------------------- 1.75s 2026-04-07 00:46:11.541085 | 
orchestrator | common : Link kolla_logs volume to /var/log/kolla ----------------------- 1.37s 2026-04-07 00:46:11.541092 | orchestrator | common : include_tasks -------------------------------------------------- 1.25s 2026-04-07 00:46:11.541099 | orchestrator | common : Find custom fluentd format config files ------------------------ 1.07s 2026-04-07 00:46:14.579255 | orchestrator | 2026-04-07 00:46:14 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:14.580214 | orchestrator | 2026-04-07 00:46:14 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state STARTED 2026-04-07 00:46:14.580239 | orchestrator | 2026-04-07 00:46:14 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:14.581392 | orchestrator | 2026-04-07 00:46:14 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:14.581409 | orchestrator | 2026-04-07 00:46:14 | INFO  | Task b14f1c02-2d1e-4afe-a2e7-779d0deebfce is in state STARTED 2026-04-07 00:46:14.581878 | orchestrator | 2026-04-07 00:46:14 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:14.583187 | orchestrator | 2026-04-07 00:46:14 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:14.583918 | orchestrator | 2026-04-07 00:46:14 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:14.583949 | orchestrator | 2026-04-07 00:46:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:17.622305 | orchestrator | 2026-04-07 00:46:17 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:17.623863 | orchestrator | 2026-04-07 00:46:17 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state STARTED 2026-04-07 00:46:17.624600 | orchestrator | 2026-04-07 00:46:17 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:17.625228 | orchestrator | 
2026-04-07 00:46:17 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:17.625944 | orchestrator | 2026-04-07 00:46:17 | INFO  | Task b14f1c02-2d1e-4afe-a2e7-779d0deebfce is in state STARTED 2026-04-07 00:46:17.629663 | orchestrator | 2026-04-07 00:46:17 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:17.629978 | orchestrator | 2026-04-07 00:46:17 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:17.630830 | orchestrator | 2026-04-07 00:46:17 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:17.630854 | orchestrator | 2026-04-07 00:46:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:20.688810 | orchestrator | 2026-04-07 00:46:20 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:20.688878 | orchestrator | 2026-04-07 00:46:20 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state STARTED 2026-04-07 00:46:20.688885 | orchestrator | 2026-04-07 00:46:20 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:20.688903 | orchestrator | 2026-04-07 00:46:20 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:20.688907 | orchestrator | 2026-04-07 00:46:20 | INFO  | Task b14f1c02-2d1e-4afe-a2e7-779d0deebfce is in state STARTED 2026-04-07 00:46:20.688911 | orchestrator | 2026-04-07 00:46:20 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:20.688915 | orchestrator | 2026-04-07 00:46:20 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:20.688919 | orchestrator | 2026-04-07 00:46:20 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:20.688924 | orchestrator | 2026-04-07 00:46:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:23.722958 | orchestrator | 
2026-04-07 00:46:23 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:23.839359 | orchestrator | 2026-04-07 00:46:23 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state STARTED 2026-04-07 00:46:23.839467 | orchestrator | 2026-04-07 00:46:23 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:23.839473 | orchestrator | 2026-04-07 00:46:23 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:23.839478 | orchestrator | 2026-04-07 00:46:23 | INFO  | Task b14f1c02-2d1e-4afe-a2e7-779d0deebfce is in state STARTED 2026-04-07 00:46:23.839482 | orchestrator | 2026-04-07 00:46:23 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:23.839486 | orchestrator | 2026-04-07 00:46:23 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:23.839490 | orchestrator | 2026-04-07 00:46:23 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:23.839495 | orchestrator | 2026-04-07 00:46:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:26.998455 | orchestrator | 2026-04-07 00:46:26 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:26.998560 | orchestrator | 2026-04-07 00:46:26 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state STARTED 2026-04-07 00:46:26.998572 | orchestrator | 2026-04-07 00:46:26 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:26.998579 | orchestrator | 2026-04-07 00:46:26 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:26.998587 | orchestrator | 2026-04-07 00:46:26 | INFO  | Task b14f1c02-2d1e-4afe-a2e7-779d0deebfce is in state STARTED 2026-04-07 00:46:26.999004 | orchestrator | 2026-04-07 00:46:26 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:27.010175 
| orchestrator | 2026-04-07 00:46:27 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:27.010832 | orchestrator | 2026-04-07 00:46:27 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:27.011062 | orchestrator | 2026-04-07 00:46:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:30.049352 | orchestrator | 2026-04-07 00:46:30 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:30.050592 | orchestrator | 2026-04-07 00:46:30 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state STARTED 2026-04-07 00:46:30.050741 | orchestrator | 2026-04-07 00:46:30 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:30.050828 | orchestrator | 2026-04-07 00:46:30 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:30.052672 | orchestrator | 2026-04-07 00:46:30 | INFO  | Task b14f1c02-2d1e-4afe-a2e7-779d0deebfce is in state STARTED 2026-04-07 00:46:30.052730 | orchestrator | 2026-04-07 00:46:30 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:30.052740 | orchestrator | 2026-04-07 00:46:30 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:30.054004 | orchestrator | 2026-04-07 00:46:30 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:30.054082 | orchestrator | 2026-04-07 00:46:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:33.086147 | orchestrator | 2026-04-07 00:46:33 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:33.087743 | orchestrator | 2026-04-07 00:46:33 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state STARTED 2026-04-07 00:46:33.088823 | orchestrator | 2026-04-07 00:46:33 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:33.089341 | 
orchestrator | 2026-04-07 00:46:33 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state STARTED 2026-04-07 00:46:33.090200 | orchestrator | 2026-04-07 00:46:33 | INFO  | Task b14f1c02-2d1e-4afe-a2e7-779d0deebfce is in state SUCCESS 2026-04-07 00:46:33.091112 | orchestrator | 2026-04-07 00:46:33.091140 | orchestrator | 2026-04-07 00:46:33.091153 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:46:33.091164 | orchestrator | 2026-04-07 00:46:33.091172 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 00:46:33.091182 | orchestrator | Tuesday 07 April 2026 00:46:16 +0000 (0:00:01.208) 0:00:01.208 ********* 2026-04-07 00:46:33.091191 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:46:33.091202 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:46:33.091211 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:46:33.091220 | orchestrator | 2026-04-07 00:46:33.091229 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:46:33.091238 | orchestrator | Tuesday 07 April 2026 00:46:17 +0000 (0:00:00.447) 0:00:01.655 ********* 2026-04-07 00:46:33.091249 | orchestrator | ok: [testbed-node-0] => (item=enable_memcached_True) 2026-04-07 00:46:33.091260 | orchestrator | ok: [testbed-node-1] => (item=enable_memcached_True) 2026-04-07 00:46:33.091269 | orchestrator | ok: [testbed-node-2] => (item=enable_memcached_True) 2026-04-07 00:46:33.091279 | orchestrator | 2026-04-07 00:46:33.091288 | orchestrator | PLAY [Apply role memcached] **************************************************** 2026-04-07 00:46:33.091297 | orchestrator | 2026-04-07 00:46:33.091307 | orchestrator | TASK [memcached : include_tasks] *********************************************** 2026-04-07 00:46:33.091314 | orchestrator | Tuesday 07 April 2026 00:46:18 +0000 (0:00:01.148) 0:00:02.804 ********* 2026-04-07 00:46:33.091320 | 
orchestrator | included: /ansible/roles/memcached/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:46:33.091356 | orchestrator | 2026-04-07 00:46:33.091400 | orchestrator | TASK [memcached : Ensuring config directories exist] *************************** 2026-04-07 00:46:33.091406 | orchestrator | Tuesday 07 April 2026 00:46:19 +0000 (0:00:01.154) 0:00:03.959 ********* 2026-04-07 00:46:33.091412 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2026-04-07 00:46:33.091419 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2026-04-07 00:46:33.091429 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2026-04-07 00:46:33.091442 | orchestrator | 2026-04-07 00:46:33.091451 | orchestrator | TASK [memcached : Copying over config.json files for services] ***************** 2026-04-07 00:46:33.091460 | orchestrator | Tuesday 07 April 2026 00:46:21 +0000 (0:00:02.195) 0:00:06.154 ********* 2026-04-07 00:46:33.091468 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2026-04-07 00:46:33.091478 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2026-04-07 00:46:33.091487 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2026-04-07 00:46:33.091496 | orchestrator | 2026-04-07 00:46:33.091506 | orchestrator | TASK [service-check-containers : memcached | Check containers] ***************** 2026-04-07 00:46:33.091513 | orchestrator | Tuesday 07 April 2026 00:46:24 +0000 (0:00:02.691) 0:00:08.846 ********* 2026-04-07 00:46:33.091526 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-04-07 00:46:33.091536 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-04-07 00:46:33.091567 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-04-07 00:46:33.091573 | orchestrator | 2026-04-07 00:46:33.091579 | orchestrator | 
TASK [service-check-containers : memcached | Notify handlers to restart containers] *** 2026-04-07 00:46:33.091584 | orchestrator | Tuesday 07 April 2026 00:46:26 +0000 (0:00:01.644) 0:00:10.490 ********* 2026-04-07 00:46:33.091590 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 00:46:33.091596 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:33.091602 | orchestrator | } 2026-04-07 00:46:33.091615 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:46:33.091621 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:33.091626 | orchestrator | } 2026-04-07 00:46:33.091632 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:46:33.091637 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:33.091643 | orchestrator | } 2026-04-07 00:46:33.091648 | orchestrator | 2026-04-07 00:46:33.091654 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:46:33.091659 | orchestrator | Tuesday 07 April 2026 00:46:26 +0000 (0:00:00.323) 0:00:10.814 ********* 2026-04-07 00:46:33.091665 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2026-04-07 00:46:33.091671 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:46:33.091677 | orchestrator | skipping: 
[testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2026-04-07 00:46:33.091682 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:46:33.091688 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2026-04-07 00:46:33.091694 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:46:33.091699 | orchestrator | 2026-04-07 00:46:33.091705 | orchestrator | RUNNING HANDLER [memcached : Restart memcached container] ********************** 2026-04-07 00:46:33.091710 | orchestrator | Tuesday 07 April 2026 00:46:28 +0000 (0:00:02.280) 0:00:13.095 ********* 2026-04-07 
00:46:33.091729 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmemcached\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_ysx15k6z/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_ysx15k6z/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_ysx15k6z/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_ysx15k6z/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", 
line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmemcached: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:33.091751 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmemcached\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_2_le1ode/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_2_le1ode/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_2_le1ode/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_2_le1ode/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File 
\"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmemcached: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:33.091771 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmemcached\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_kyzgbys0/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_kyzgbys0/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_kyzgbys0/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_kyzgbys0/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmemcached: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:33.091779 | orchestrator | 2026-04-07 00:46:33.091786 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:46:33.091794 | orchestrator | testbed-node-0 : ok=7  changed=4  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:46:33.091803 | orchestrator | testbed-node-1 : ok=7  changed=4  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:46:33.091809 | orchestrator | testbed-node-2 : ok=7  changed=4  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:46:33.091816 | orchestrator | 2026-04-07 00:46:33.091823 | orchestrator | 2026-04-07 00:46:33.091829 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:46:33.091836 | orchestrator | Tuesday 07 April 2026 
00:46:30 +0000 (0:00:02.281) 0:00:15.377 ********* 2026-04-07 00:46:33.091843 | orchestrator | =============================================================================== 2026-04-07 00:46:33.091849 | orchestrator | memcached : Copying over config.json files for services ----------------- 2.69s 2026-04-07 00:46:33.091856 | orchestrator | memcached : Restart memcached container --------------------------------- 2.28s 2026-04-07 00:46:33.091862 | orchestrator | service-check-containers : Include tasks -------------------------------- 2.28s 2026-04-07 00:46:33.091873 | orchestrator | memcached : Ensuring config directories exist --------------------------- 2.20s 2026-04-07 00:46:33.091880 | orchestrator | service-check-containers : memcached | Check containers ----------------- 1.64s 2026-04-07 00:46:33.091887 | orchestrator | memcached : include_tasks ----------------------------------------------- 1.15s 2026-04-07 00:46:33.091893 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.15s 2026-04-07 00:46:33.091904 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.45s 2026-04-07 00:46:33.091911 | orchestrator | service-check-containers : memcached | Notify handlers to restart containers --- 0.32s 2026-04-07 00:46:33.094262 | orchestrator | 2026-04-07 00:46:33 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:33.097300 | orchestrator | 2026-04-07 00:46:33 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:33.098826 | orchestrator | 2026-04-07 00:46:33 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:33.099736 | orchestrator | 2026-04-07 00:46:33 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:33.099955 | orchestrator | 2026-04-07 00:46:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:36.133933 | orchestrator | 
2026-04-07 00:46:36 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:36.136308 | orchestrator | 2026-04-07 00:46:36.136412 | orchestrator | 2026-04-07 00:46:36 | INFO  | Task ed79e852-9ada-45a6-aadb-9ab5a1c0a97e is in state SUCCESS 2026-04-07 00:46:36.139697 | orchestrator | 2026-04-07 00:46:36.139775 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:46:36.139784 | orchestrator | 2026-04-07 00:46:36.139792 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 00:46:36.139800 | orchestrator | Tuesday 07 April 2026 00:46:14 +0000 (0:00:00.561) 0:00:00.561 ********* 2026-04-07 00:46:36.139806 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:46:36.139814 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:46:36.139821 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:46:36.139827 | orchestrator | 2026-04-07 00:46:36.139834 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:46:36.139840 | orchestrator | Tuesday 07 April 2026 00:46:15 +0000 (0:00:00.607) 0:00:01.169 ********* 2026-04-07 00:46:36.139848 | orchestrator | ok: [testbed-node-0] => (item=enable_redis_True) 2026-04-07 00:46:36.139860 | orchestrator | ok: [testbed-node-1] => (item=enable_redis_True) 2026-04-07 00:46:36.139870 | orchestrator | ok: [testbed-node-2] => (item=enable_redis_True) 2026-04-07 00:46:36.139879 | orchestrator | 2026-04-07 00:46:36.139888 | orchestrator | PLAY [Apply role redis] ******************************************************** 2026-04-07 00:46:36.139898 | orchestrator | 2026-04-07 00:46:36.139909 | orchestrator | TASK [redis : include_tasks] *************************************************** 2026-04-07 00:46:36.139920 | orchestrator | Tuesday 07 April 2026 00:46:15 +0000 (0:00:00.438) 0:00:01.607 ********* 2026-04-07 00:46:36.139927 | orchestrator | included: 
/ansible/roles/redis/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:46:36.139948 | orchestrator | 2026-04-07 00:46:36.139955 | orchestrator | TASK [redis : Ensuring config directories exist] ******************************* 2026-04-07 00:46:36.139961 | orchestrator | Tuesday 07 April 2026 00:46:16 +0000 (0:00:01.248) 0:00:02.856 ********* 2026-04-07 00:46:36.139970 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140004 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140011 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140031 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140052 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140060 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': 
True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140066 | orchestrator | 2026-04-07 00:46:36.140073 | orchestrator | TASK [redis : Copying over default config.json files] ************************** 2026-04-07 00:46:36.140079 | orchestrator | Tuesday 07 April 2026 00:46:19 +0000 (0:00:02.607) 0:00:05.463 ********* 2026-04-07 00:46:36.140086 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140099 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140106 | orchestrator | changed: [testbed-node-0] => 
(item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140115 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140129 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 
'timeout': '30'}}}) 2026-04-07 00:46:36.140140 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140156 | orchestrator | 2026-04-07 00:46:36.140167 | orchestrator | TASK [redis : Copying over redis config files] ********************************* 2026-04-07 00:46:36.140176 | orchestrator | Tuesday 07 April 2026 00:46:22 +0000 (0:00:03.377) 0:00:08.841 ********* 2026-04-07 00:46:36.140186 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140204 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140214 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140229 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140247 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 
'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140258 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140268 | orchestrator | 2026-04-07 00:46:36.140279 | orchestrator | TASK [service-check-containers : redis | Check containers] ********************* 2026-04-07 00:46:36.140290 | orchestrator | Tuesday 07 April 2026 00:46:26 +0000 (0:00:03.637) 0:00:12.478 ********* 2026-04-07 00:46:36.140314 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 
00:46:36.140328 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140336 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140348 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140356 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140392 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-07 00:46:36.140400 | orchestrator | 2026-04-07 00:46:36.140408 | orchestrator | TASK [service-check-containers : redis | Notify handlers to restart containers] *** 2026-04-07 00:46:36.140421 | orchestrator | Tuesday 07 April 2026 00:46:29 +0000 (0:00:02.571) 0:00:15.049 ********* 2026-04-07 00:46:36.140428 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:46:36.140435 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:36.140443 | orchestrator | } 2026-04-07 00:46:36.140450 | orchestrator | changed: [testbed-node-1] => { 
2026-04-07 00:46:36.140457 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:36.140464 | orchestrator | } 2026-04-07 00:46:36.140471 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:46:36.140479 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:36.140485 | orchestrator | } 2026-04-07 00:46:36.140496 | orchestrator | 2026-04-07 00:46:36.140507 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:46:36.140517 | orchestrator | Tuesday 07 April 2026 00:46:30 +0000 (0:00:01.501) 0:00:16.550 ********* 2026-04-07 00:46:36.140528 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-04-07 00:46:36.140540 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-04-07 00:46:36.140551 | orchestrator | skipping: 
[testbed-node-0] 2026-04-07 00:46:36.140562 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-04-07 00:46:36.140578 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-04-07 00:46:36.140594 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-04-07 00:46:36.140614 | 
orchestrator | skipping: [testbed-node-1] 2026-04-07 00:46:36.140626 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-04-07 00:46:36.140636 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:46:36.140647 | orchestrator | 2026-04-07 00:46:36.140655 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2026-04-07 00:46:36.140665 | orchestrator | Tuesday 07 April 2026 00:46:31 +0000 (0:00:01.308) 0:00:17.859 ********* 2026-04-07 00:46:36.140675 | orchestrator | 2026-04-07 00:46:36.140685 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2026-04-07 00:46:36.140696 | orchestrator | Tuesday 07 April 2026 00:46:32 +0000 (0:00:00.264) 0:00:18.124 ********* 2026-04-07 00:46:36.140705 | orchestrator | 2026-04-07 00:46:36.140715 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2026-04-07 00:46:36.140725 | orchestrator | Tuesday 07 April 2026 00:46:32 +0000 (0:00:00.297) 0:00:18.422 ********* 2026-04-07 00:46:36.140735 | orchestrator | 2026-04-07 00:46:36.140745 | orchestrator | RUNNING HANDLER [redis : Restart redis container] ****************************** 2026-04-07 00:46:36.140756 | orchestrator | Tuesday 07 April 2026 00:46:32 +0000 (0:00:00.262) 0:00:18.684 
********* 2026-04-07 00:46:36.140791 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fredis\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_n8zh1erv/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_n8zh1erv/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_n8zh1erv/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_n8zh1erv/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File 
\"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fredis: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:36.140815 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fredis\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload__x3heiab/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload__x3heiab/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload__x3heiab/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload__x3heiab/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in 
self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fredis: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:36.140841 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fredis\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_48518xe2/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_48518xe2/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_48518xe2/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_48518xe2/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fredis: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:36.140859 | orchestrator | 2026-04-07 00:46:36.140870 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:46:36.140882 | orchestrator | testbed-node-0 : ok=8  changed=5  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:46:36.140894 | orchestrator | testbed-node-1 : ok=8  changed=5  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:46:36.140905 | orchestrator | testbed-node-2 : ok=8  changed=5  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:46:36.140915 | orchestrator | 2026-04-07 00:46:36.140924 | orchestrator | 2026-04-07 00:46:36.140933 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:46:36.140944 | orchestrator | Tuesday 07 April 2026 
00:46:34 +0000 (0:00:01.993) 0:00:20.677 ********* 2026-04-07 00:46:36.140955 | orchestrator | =============================================================================== 2026-04-07 00:46:36.140965 | orchestrator | redis : Copying over redis config files --------------------------------- 3.64s 2026-04-07 00:46:36.140971 | orchestrator | redis : Copying over default config.json files -------------------------- 3.38s 2026-04-07 00:46:36.140978 | orchestrator | redis : Ensuring config directories exist ------------------------------- 2.61s 2026-04-07 00:46:36.140984 | orchestrator | service-check-containers : redis | Check containers --------------------- 2.57s 2026-04-07 00:46:36.140990 | orchestrator | redis : Restart redis container ----------------------------------------- 1.99s 2026-04-07 00:46:36.140996 | orchestrator | service-check-containers : redis | Notify handlers to restart containers --- 1.50s 2026-04-07 00:46:36.141003 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.31s 2026-04-07 00:46:36.141009 | orchestrator | redis : include_tasks --------------------------------------------------- 1.25s 2026-04-07 00:46:36.141015 | orchestrator | redis : Flush handlers -------------------------------------------------- 0.82s 2026-04-07 00:46:36.141021 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.61s 2026-04-07 00:46:36.141028 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.44s 2026-04-07 00:46:36.141034 | orchestrator | 2026-04-07 00:46:36 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:36.141136 | orchestrator | 2026-04-07 00:46:36 | INFO  | Task b8b756cc-d573-47a0-baf3-05cc44364a08 is in state SUCCESS 2026-04-07 00:46:36.143692 | orchestrator | 2026-04-07 00:46:36 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:36.145944 | orchestrator | 
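Editor's note on the failure above: the pull is rejected with `invalid reference format` because the image name `registry.osism.tech/kolla/release//redis` contains an empty path component (the `//` after `release`), which the Docker/OCI reference grammar forbids. This pattern typically appears when an image name is assembled from variables (e.g. a registry prefix plus a namespace) and one segment renders empty. The sketch below is not Docker's actual parser; it approximates the per-component rule of the reference grammar with a regex to show why the doubled slash fails validation, and the variable names are illustrative only.

```python
import re

# Approximation of the OCI/Docker reference grammar's path-component rule:
# each '/'-separated component must be non-empty lowercase alphanumerics,
# with optional single separators (., _, __, or runs of -) between runs.
COMPONENT = re.compile(r"^[a-z0-9]+(?:(?:[._]|__|-+)[a-z0-9]+)*$")

def is_valid_repository(repo: str) -> bool:
    """True if every '/'-separated component of the repository is well-formed."""
    return all(COMPONENT.match(part) for part in repo.split("/"))

# The repository from the failing pull: the empty component between
# 'release' and 'redis' (produced by '//') makes it invalid.
broken = "registry.osism.tech/kolla/release//redis"
fixed = "registry.osism.tech/kolla/release/redis"

def join_image_path(*segments: str) -> str:
    """Join image-name segments, dropping empty ones so no '//' is produced.

    Guards against the likely root cause here: an empty namespace or
    sub-path variable being interpolated into the image name.
    """
    return "/".join(s.strip("/") for s in segments if s.strip("/"))
```

With this guard, `join_image_path("registry.osism.tech", "kolla/release", "", "redis")` yields the fixed form even when the middle segment is empty; the durable fix, however, is to correct the configuration value that rendered empty.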
2026-04-07 00:46:36 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:36.147758 | orchestrator | 2026-04-07 00:46:36 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:36.149590 | orchestrator | 2026-04-07 00:46:36 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:36.150521 | orchestrator | 2026-04-07 00:46:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:39.185061 | orchestrator | 2026-04-07 00:46:39 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:39.185631 | orchestrator | 2026-04-07 00:46:39 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:39.186877 | orchestrator | 2026-04-07 00:46:39 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:39.187998 | orchestrator | 2026-04-07 00:46:39 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:39.189826 | orchestrator | 2026-04-07 00:46:39 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:39.190982 | orchestrator | 2026-04-07 00:46:39 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:39.191309 | orchestrator | 2026-04-07 00:46:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:42.225875 | orchestrator | 2026-04-07 00:46:42 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:42.228206 | orchestrator | 2026-04-07 00:46:42 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state STARTED 2026-04-07 00:46:42.228259 | orchestrator | 2026-04-07 00:46:42 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:42.230224 | orchestrator | 2026-04-07 00:46:42 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:42.234863 | orchestrator | 
2026-04-07 00:46:42 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state STARTED 2026-04-07 00:46:42.236645 | orchestrator | 2026-04-07 00:46:42 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:42.236695 | orchestrator | 2026-04-07 00:46:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:45.306750 | orchestrator | 2026-04-07 00:46:45 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:45.306837 | orchestrator | 2026-04-07 00:46:45 | INFO  | Task cfb58b50-a449-4ae6-9634-c6a7274f9754 is in state SUCCESS 2026-04-07 00:46:45.306849 | orchestrator | 2026-04-07 00:46:45 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:45.306857 | orchestrator | 2026-04-07 00:46:45 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:45.306862 | orchestrator | 2026-04-07 00:46:45 | INFO  | Task 28cc1627-00ab-4cb1-8dc0-b01180acb5c9 is in state SUCCESS 2026-04-07 00:46:45.307651 | orchestrator | 2026-04-07 00:46:45.307724 | orchestrator | 2026-04-07 00:46:45.307736 | orchestrator | PLAY [Apply role phpmyadmin] *************************************************** 2026-04-07 00:46:45.307742 | orchestrator | 2026-04-07 00:46:45.307771 | orchestrator | TASK [osism.services.phpmyadmin : Create traefik external network] ************* 2026-04-07 00:46:45.307780 | orchestrator | Tuesday 07 April 2026 00:45:24 +0000 (0:00:00.253) 0:00:00.253 ********* 2026-04-07 00:46:45.307788 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:45.307797 | orchestrator | 2026-04-07 00:46:45.307805 | orchestrator | TASK [osism.services.phpmyadmin : Create required directories] ***************** 2026-04-07 00:46:45.307813 | orchestrator | Tuesday 07 April 2026 00:45:27 +0000 (0:00:02.463) 0:00:02.717 ********* 2026-04-07 00:46:45.307821 | orchestrator | changed: [testbed-manager] => (item=/opt/phpmyadmin) 2026-04-07 00:46:45.307874 | 
orchestrator | 2026-04-07 00:46:45.307885 | orchestrator | TASK [osism.services.phpmyadmin : Copy docker-compose.yml file] **************** 2026-04-07 00:46:45.307901 | orchestrator | Tuesday 07 April 2026 00:45:27 +0000 (0:00:00.576) 0:00:03.293 ********* 2026-04-07 00:46:45.307909 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:45.307916 | orchestrator | 2026-04-07 00:46:45.307923 | orchestrator | TASK [osism.services.phpmyadmin : Manage phpmyadmin service] ******************* 2026-04-07 00:46:45.307931 | orchestrator | Tuesday 07 April 2026 00:45:29 +0000 (0:00:01.623) 0:00:04.916 ********* 2026-04-07 00:46:45.307939 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage phpmyadmin service (10 retries left). 2026-04-07 00:46:45.307947 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:45.307955 | orchestrator | 2026-04-07 00:46:45.307962 | orchestrator | RUNNING HANDLER [osism.services.phpmyadmin : Restart phpmyadmin service] ******* 2026-04-07 00:46:45.307970 | orchestrator | Tuesday 07 April 2026 00:46:29 +0000 (0:00:59.780) 0:01:04.696 ********* 2026-04-07 00:46:45.307977 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:45.307983 | orchestrator | 2026-04-07 00:46:45.307988 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:46:45.307992 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:45.307998 | orchestrator | 2026-04-07 00:46:45.308002 | orchestrator | 2026-04-07 00:46:45.308010 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:46:45.308017 | orchestrator | Tuesday 07 April 2026 00:46:34 +0000 (0:00:05.289) 0:01:09.986 ********* 2026-04-07 00:46:45.308066 | orchestrator | =============================================================================== 2026-04-07 00:46:45.308076 | orchestrator | osism.services.phpmyadmin : Manage 
phpmyadmin service ------------------ 59.78s 2026-04-07 00:46:45.308084 | orchestrator | osism.services.phpmyadmin : Restart phpmyadmin service ------------------ 5.29s 2026-04-07 00:46:45.308092 | orchestrator | osism.services.phpmyadmin : Create traefik external network ------------- 2.46s 2026-04-07 00:46:45.308099 | orchestrator | osism.services.phpmyadmin : Copy docker-compose.yml file ---------------- 1.62s 2026-04-07 00:46:45.308107 | orchestrator | osism.services.phpmyadmin : Create required directories ----------------- 0.58s 2026-04-07 00:46:45.308114 | orchestrator | 2026-04-07 00:46:45.308121 | orchestrator | 2026-04-07 00:46:45.308129 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:46:45.308137 | orchestrator | 2026-04-07 00:46:45.308145 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:46:45.308153 | orchestrator | Tuesday 07 April 2026 00:45:09 +0000 (0:00:00.830) 0:00:00.831 ********* 2026-04-07 00:46:45.308190 | orchestrator | changed: [testbed-manager] => (item=enable_netdata_True) 2026-04-07 00:46:45.308197 | orchestrator | changed: [testbed-node-0] => (item=enable_netdata_True) 2026-04-07 00:46:45.308205 | orchestrator | changed: [testbed-node-1] => (item=enable_netdata_True) 2026-04-07 00:46:45.308212 | orchestrator | changed: [testbed-node-2] => (item=enable_netdata_True) 2026-04-07 00:46:45.308219 | orchestrator | changed: [testbed-node-3] => (item=enable_netdata_True) 2026-04-07 00:46:45.308227 | orchestrator | changed: [testbed-node-4] => (item=enable_netdata_True) 2026-04-07 00:46:45.308235 | orchestrator | changed: [testbed-node-5] => (item=enable_netdata_True) 2026-04-07 00:46:45.308242 | orchestrator | 2026-04-07 00:46:45.308250 | orchestrator | PLAY [Apply role netdata] ****************************************************** 2026-04-07 00:46:45.308258 | orchestrator | 2026-04-07 00:46:45.308266 | orchestrator | 
TASK [osism.services.netdata : Include distribution specific install tasks] **** 2026-04-07 00:46:45.308274 | orchestrator | Tuesday 07 April 2026 00:45:10 +0000 (0:00:01.731) 0:00:02.563 ********* 2026-04-07 00:46:45.308282 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-5, testbed-node-4 2026-04-07 00:46:45.308298 | orchestrator | 2026-04-07 00:46:45.308306 | orchestrator | TASK [osism.services.netdata : Remove old architecture-dependent repository] *** 2026-04-07 00:46:45.308314 | orchestrator | Tuesday 07 April 2026 00:45:11 +0000 (0:00:01.076) 0:00:03.639 ********* 2026-04-07 00:46:45.308322 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:45.308330 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:46:45.308338 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:46:45.308345 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:46:45.308353 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:46:45.308391 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:46:45.308398 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:46:45.308406 | orchestrator | 2026-04-07 00:46:45.308414 | orchestrator | TASK [osism.services.netdata : Install apt-transport-https package] ************ 2026-04-07 00:46:45.308421 | orchestrator | Tuesday 07 April 2026 00:45:15 +0000 (0:00:04.022) 0:00:07.661 ********* 2026-04-07 00:46:45.308429 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:46:45.308437 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:46:45.308444 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:45.308451 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:46:45.308458 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:46:45.308484 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:46:45.308492 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:46:45.308500 | 
orchestrator | 2026-04-07 00:46:45.308518 | orchestrator | TASK [osism.services.netdata : Add repository gpg key] ************************* 2026-04-07 00:46:45.308527 | orchestrator | Tuesday 07 April 2026 00:45:18 +0000 (0:00:02.598) 0:00:10.260 ********* 2026-04-07 00:46:45.308535 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:45.308543 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:45.308551 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:45.308559 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:45.308567 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:45.308574 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:46:45.308582 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:46:45.308589 | orchestrator | 2026-04-07 00:46:45.308597 | orchestrator | TASK [osism.services.netdata : Add repository] ********************************* 2026-04-07 00:46:45.308605 | orchestrator | Tuesday 07 April 2026 00:45:20 +0000 (0:00:02.083) 0:00:12.343 ********* 2026-04-07 00:46:45.308613 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:46:45.308621 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:45.308628 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:45.308636 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:45.308644 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:45.308651 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:46:45.308659 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:45.308666 | orchestrator | 2026-04-07 00:46:45.308673 | orchestrator | TASK [osism.services.netdata : Install package netdata] ************************ 2026-04-07 00:46:45.308680 | orchestrator | Tuesday 07 April 2026 00:45:31 +0000 (0:00:10.401) 0:00:22.744 ********* 2026-04-07 00:46:45.308688 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:46:45.308696 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:45.308703 | orchestrator | 
changed: [testbed-node-3] 2026-04-07 00:46:45.308711 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:45.308718 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:45.308726 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:45.308734 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:45.308741 | orchestrator | 2026-04-07 00:46:45.308749 | orchestrator | TASK [osism.services.netdata : Include config tasks] *************************** 2026-04-07 00:46:45.308756 | orchestrator | Tuesday 07 April 2026 00:46:16 +0000 (0:00:45.879) 0:01:08.624 ********* 2026-04-07 00:46:45.308768 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:46:45.308777 | orchestrator | 2026-04-07 00:46:45.308785 | orchestrator | TASK [osism.services.netdata : Copy configuration files] *********************** 2026-04-07 00:46:45.308797 | orchestrator | Tuesday 07 April 2026 00:46:18 +0000 (0:00:01.106) 0:01:09.730 ********* 2026-04-07 00:46:45.308805 | orchestrator | changed: [testbed-node-0] => (item=netdata.conf) 2026-04-07 00:46:45.308812 | orchestrator | changed: [testbed-node-2] => (item=netdata.conf) 2026-04-07 00:46:45.308820 | orchestrator | changed: [testbed-node-1] => (item=netdata.conf) 2026-04-07 00:46:45.308827 | orchestrator | changed: [testbed-manager] => (item=netdata.conf) 2026-04-07 00:46:45.308835 | orchestrator | changed: [testbed-node-5] => (item=netdata.conf) 2026-04-07 00:46:45.308843 | orchestrator | changed: [testbed-node-3] => (item=netdata.conf) 2026-04-07 00:46:45.308850 | orchestrator | changed: [testbed-node-4] => (item=netdata.conf) 2026-04-07 00:46:45.308858 | orchestrator | changed: [testbed-node-3] => (item=stream.conf) 2026-04-07 00:46:45.308866 | orchestrator | changed: [testbed-node-2] => (item=stream.conf) 2026-04-07 
00:46:45.308873 | orchestrator | changed: [testbed-node-1] => (item=stream.conf) 2026-04-07 00:46:45.308881 | orchestrator | changed: [testbed-node-5] => (item=stream.conf) 2026-04-07 00:46:45.308888 | orchestrator | changed: [testbed-manager] => (item=stream.conf) 2026-04-07 00:46:45.308895 | orchestrator | changed: [testbed-node-0] => (item=stream.conf) 2026-04-07 00:46:45.308903 | orchestrator | changed: [testbed-node-4] => (item=stream.conf) 2026-04-07 00:46:45.308910 | orchestrator | 2026-04-07 00:46:45.308918 | orchestrator | TASK [osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status] *** 2026-04-07 00:46:45.308926 | orchestrator | Tuesday 07 April 2026 00:46:21 +0000 (0:00:03.819) 0:01:13.550 ********* 2026-04-07 00:46:45.308934 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:45.308942 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:46:45.308949 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:46:45.308957 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:46:45.308965 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:46:45.308973 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:46:45.308980 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:46:45.308988 | orchestrator | 2026-04-07 00:46:45.308995 | orchestrator | TASK [osism.services.netdata : Opt out from anonymous statistics] ************** 2026-04-07 00:46:45.309003 | orchestrator | Tuesday 07 April 2026 00:46:23 +0000 (0:00:01.153) 0:01:14.704 ********* 2026-04-07 00:46:45.309011 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:45.309018 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:46:45.309026 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:45.309033 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:45.309041 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:45.309049 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:45.309057 | orchestrator | changed: [testbed-node-5] 2026-04-07 
00:46:45.309065 | orchestrator | 2026-04-07 00:46:45.309073 | orchestrator | TASK [osism.services.netdata : Add netdata user to docker group] *************** 2026-04-07 00:46:45.309080 | orchestrator | Tuesday 07 April 2026 00:46:24 +0000 (0:00:01.240) 0:01:15.944 ********* 2026-04-07 00:46:45.309088 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:45.309095 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:46:45.309103 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:46:45.309110 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:46:45.309118 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:46:45.309126 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:46:45.309134 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:46:45.309141 | orchestrator | 2026-04-07 00:46:45.309149 | orchestrator | TASK [osism.services.netdata : Manage service netdata] ************************* 2026-04-07 00:46:45.309157 | orchestrator | Tuesday 07 April 2026 00:46:25 +0000 (0:00:01.550) 0:01:17.495 ********* 2026-04-07 00:46:45.309165 | orchestrator | ok: [testbed-manager] 2026-04-07 00:46:45.309172 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:46:45.309185 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:46:45.309193 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:46:45.309200 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:46:45.309208 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:46:45.309222 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:46:45.309230 | orchestrator | 2026-04-07 00:46:45.309238 | orchestrator | TASK [osism.services.netdata : Include host type specific tasks] *************** 2026-04-07 00:46:45.309246 | orchestrator | Tuesday 07 April 2026 00:46:28 +0000 (0:00:02.428) 0:01:19.924 ********* 2026-04-07 00:46:45.309253 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/server.yml for testbed-manager 2026-04-07 00:46:45.309263 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/client.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:46:45.309271 | orchestrator | 2026-04-07 00:46:45.309278 | orchestrator | TASK [osism.services.netdata : Set sysctl vm.max_map_count parameter] ********** 2026-04-07 00:46:45.309286 | orchestrator | Tuesday 07 April 2026 00:46:29 +0000 (0:00:01.594) 0:01:21.519 ********* 2026-04-07 00:46:45.309293 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:45.309300 | orchestrator | 2026-04-07 00:46:45.309308 | orchestrator | RUNNING HANDLER [osism.services.netdata : Restart service netdata] ************* 2026-04-07 00:46:45.309315 | orchestrator | Tuesday 07 April 2026 00:46:31 +0000 (0:00:02.123) 0:01:23.642 ********* 2026-04-07 00:46:45.309323 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:46:45.309331 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:46:45.309339 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:46:45.309346 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:46:45.309354 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:46:45.309444 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:46:45.309452 | orchestrator | changed: [testbed-manager] 2026-04-07 00:46:45.309460 | orchestrator | 2026-04-07 00:46:45.309467 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:46:45.309475 | orchestrator | testbed-manager : ok=16  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:45.309487 | orchestrator | testbed-node-0 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:45.309495 | orchestrator | testbed-node-1 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:45.309502 | orchestrator | testbed-node-2 : ok=15  changed=7  unreachable=0 failed=0 
skipped=0 rescued=0 ignored=0 2026-04-07 00:46:45.309510 | orchestrator | testbed-node-3 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:45.309517 | orchestrator | testbed-node-4 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:45.309525 | orchestrator | testbed-node-5 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:46:45.309532 | orchestrator | 2026-04-07 00:46:45.309539 | orchestrator | 2026-04-07 00:46:45.309546 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:46:45.309554 | orchestrator | Tuesday 07 April 2026 00:46:43 +0000 (0:00:11.158) 0:01:34.801 ********* 2026-04-07 00:46:45.309562 | orchestrator | =============================================================================== 2026-04-07 00:46:45.309569 | orchestrator | osism.services.netdata : Install package netdata ----------------------- 45.88s 2026-04-07 00:46:45.309577 | orchestrator | osism.services.netdata : Restart service netdata ----------------------- 11.16s 2026-04-07 00:46:45.309584 | orchestrator | osism.services.netdata : Add repository -------------------------------- 10.40s 2026-04-07 00:46:45.309591 | orchestrator | osism.services.netdata : Remove old architecture-dependent repository --- 4.02s 2026-04-07 00:46:45.309598 | orchestrator | osism.services.netdata : Copy configuration files ----------------------- 3.82s 2026-04-07 00:46:45.309613 | orchestrator | osism.services.netdata : Install apt-transport-https package ------------ 2.60s 2026-04-07 00:46:45.309620 | orchestrator | osism.services.netdata : Manage service netdata ------------------------- 2.43s 2026-04-07 00:46:45.309628 | orchestrator | osism.services.netdata : Set sysctl vm.max_map_count parameter ---------- 2.12s 2026-04-07 00:46:45.309635 | orchestrator | osism.services.netdata : Add repository gpg key ------------------------- 2.08s 
2026-04-07 00:46:45.309643 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.73s
2026-04-07 00:46:45.309650 | orchestrator | osism.services.netdata : Include host type specific tasks --------------- 1.60s
2026-04-07 00:46:45.309657 | orchestrator | osism.services.netdata : Add netdata user to docker group --------------- 1.55s
2026-04-07 00:46:45.309664 | orchestrator | osism.services.netdata : Opt out from anonymous statistics -------------- 1.24s
2026-04-07 00:46:45.309672 | orchestrator | osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status --- 1.15s
2026-04-07 00:46:45.309679 | orchestrator | osism.services.netdata : Include config tasks --------------------------- 1.11s
2026-04-07 00:46:45.309686 | orchestrator | osism.services.netdata : Include distribution specific install tasks ---- 1.08s
2026-04-07 00:46:45.309693 | orchestrator |
2026-04-07 00:46:45.309701 | orchestrator |
2026-04-07 00:46:45.309709 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:46:45.309716 | orchestrator |
2026-04-07 00:46:45.309729 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 00:46:45.309736 | orchestrator | Tuesday 07 April 2026 00:46:14 +0000 (0:00:00.326) 0:00:00.326 *********
2026-04-07 00:46:45.309743 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:46:45.309751 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:46:45.309758 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:46:45.309765 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:46:45.309773 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:46:45.309780 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:46:45.309788 | orchestrator |
2026-04-07 00:46:45.309795 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 00:46:45.309803 | orchestrator | Tuesday 07 April 2026 00:46:15 +0000 (0:00:01.311) 0:00:01.638 *********
2026-04-07 00:46:45.309810 | orchestrator | ok: [testbed-node-0] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-07 00:46:45.309818 | orchestrator | ok: [testbed-node-1] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-07 00:46:45.309825 | orchestrator | ok: [testbed-node-2] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-07 00:46:45.309833 | orchestrator | ok: [testbed-node-3] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-07 00:46:45.309839 | orchestrator | ok: [testbed-node-4] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-07 00:46:45.309846 | orchestrator | ok: [testbed-node-5] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-07 00:46:45.309854 | orchestrator |
2026-04-07 00:46:45.309862 | orchestrator | PLAY [Apply role openvswitch] **************************************************
2026-04-07 00:46:45.309870 | orchestrator |
2026-04-07 00:46:45.309877 | orchestrator | TASK [openvswitch : include_tasks] *********************************************
2026-04-07 00:46:45.309884 | orchestrator | Tuesday 07 April 2026 00:46:17 +0000 (0:00:02.118) 0:00:03.756 *********
2026-04-07 00:46:45.309892 | orchestrator | included: /ansible/roles/openvswitch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:46:45.309900 | orchestrator |
2026-04-07 00:46:45.309907 | orchestrator | TASK [module-load : Load modules] **********************************************
2026-04-07 00:46:45.309915 | orchestrator | Tuesday 07 April 2026 00:46:20 +0000 (0:00:02.736) 0:00:06.492 *********
2026-04-07 00:46:45.309925 | orchestrator | changed: [testbed-node-1] => (item=openvswitch)
2026-04-07 00:46:45.309932 | orchestrator | changed: [testbed-node-0] => (item=openvswitch)
2026-04-07 00:46:45.309940 | orchestrator | changed: [testbed-node-3] => (item=openvswitch)
2026-04-07 00:46:45.309950 | orchestrator | changed: [testbed-node-2] => (item=openvswitch)
2026-04-07 00:46:45.309963 | orchestrator | changed: [testbed-node-4] => (item=openvswitch)
2026-04-07 00:46:45.309970 | orchestrator | changed: [testbed-node-5] => (item=openvswitch)
2026-04-07 00:46:45.309976 | orchestrator |
2026-04-07 00:46:45.309983 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************
2026-04-07 00:46:45.309990 | orchestrator | Tuesday 07 April 2026 00:46:22 +0000 (0:00:02.150) 0:00:08.643 *********
2026-04-07 00:46:45.309997 | orchestrator | changed: [testbed-node-3] => (item=openvswitch)
2026-04-07 00:46:45.310004 | orchestrator | changed: [testbed-node-1] => (item=openvswitch)
2026-04-07 00:46:45.310048 | orchestrator | changed: [testbed-node-5] => (item=openvswitch)
2026-04-07 00:46:45.310059 | orchestrator | changed: [testbed-node-0] => (item=openvswitch)
2026-04-07 00:46:45.310066 | orchestrator | changed: [testbed-node-4] => (item=openvswitch)
2026-04-07 00:46:45.310073 | orchestrator | changed: [testbed-node-2] => (item=openvswitch)
2026-04-07 00:46:45.310080 | orchestrator |
2026-04-07 00:46:45.310087 | orchestrator | TASK [module-load : Drop module persistence] ***********************************
2026-04-07 00:46:45.310094 | orchestrator | Tuesday 07 April 2026 00:46:25 +0000 (0:00:02.805) 0:00:11.449 *********
2026-04-07 00:46:45.310101 | orchestrator | skipping: [testbed-node-0] => (item=openvswitch)
2026-04-07 00:46:45.310108 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:46:45.310115 | orchestrator | skipping: [testbed-node-1] => (item=openvswitch)
2026-04-07 00:46:45.310122 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:46:45.310129 | orchestrator | skipping: [testbed-node-2] => (item=openvswitch)
2026-04-07 00:46:45.310136 | orchestrator | skipping: [testbed-node-3] => (item=openvswitch)
2026-04-07 00:46:45.310143 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:46:45.310150 | orchestrator | skipping: [testbed-node-4] => (item=openvswitch)
2026-04-07 00:46:45.310157 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:46:45.310164 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:46:45.310171 | orchestrator | skipping: [testbed-node-5] => (item=openvswitch)
2026-04-07 00:46:45.310178 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:46:45.310185 | orchestrator |
2026-04-07 00:46:45.310192 | orchestrator | TASK [openvswitch : Create /run/openvswitch directory on host] *****************
2026-04-07 00:46:45.310199 | orchestrator | Tuesday 07 April 2026 00:46:27 +0000 (0:00:01.627) 0:00:13.076 *********
2026-04-07 00:46:45.310206 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:46:45.310213 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:46:45.310220 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:46:45.310227 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:46:45.310234 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:46:45.310241 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:46:45.310248 | orchestrator |
2026-04-07 00:46:45.310254 | orchestrator | TASK [openvswitch : Ensuring config directories exist] *************************
2026-04-07 00:46:45.310261 | orchestrator | Tuesday 07 April 2026 00:46:27 +0000 (0:00:00.676) 0:00:13.752 *********
2026-04-07 00:46:45.310283 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/',
'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310294 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310310 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310319 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 
'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310327 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310335 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 
00:46:45.310346 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310369 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310382 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310390 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310398 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310409 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 
'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310417 | orchestrator | 2026-04-07 00:46:45.310424 | orchestrator | TASK [openvswitch : Copying over config.json files for services] *************** 2026-04-07 00:46:45.310431 | orchestrator | Tuesday 07 April 2026 00:46:31 +0000 (0:00:03.447) 0:00:17.200 ********* 2026-04-07 00:46:45.310446 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310456 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': 
['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310464 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310471 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310478 | 
orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310490 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310503 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310514 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310521 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310529 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 
'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310536 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310551 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-04-07 00:46:45.310560 | orchestrator |
2026-04-07 00:46:45.310567 | orchestrator | TASK [openvswitch : Copying over ovs-vsctl wrapper] ****************************
2026-04-07 00:46:45.310574 | orchestrator | Tuesday 07 April 2026 00:46:35 +0000 (0:00:04.081) 0:00:21.281 *********
2026-04-07 00:46:45.310582 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:46:45.310589 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:46:45.310596 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:46:45.310603 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:46:45.310610 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:46:45.310617 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:46:45.310623 | orchestrator |
2026-04-07 00:46:45.310630 | orchestrator | TASK [service-check-containers : openvswitch | Check containers] ***************
2026-04-07 00:46:45.310637 | orchestrator | Tuesday 07 April 2026 00:46:36 +0000 (0:00:00.611) 0:00:21.892 *********
2026-04-07 00:46:45.310647 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-04-07 00:46:45.310655 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image':
'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310663 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310674 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310687 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310694 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310705 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': 
['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310712 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310719 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310736 | orchestrator | changed: [testbed-node-2] 
=> (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310744 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310754 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-07 00:46:45.310761 | orchestrator | 2026-04-07 00:46:45.310769 | orchestrator | TASK [service-check-containers : openvswitch | Notify handlers to restart containers] *** 2026-04-07 00:46:45.310776 | orchestrator | Tuesday 07 April 2026 00:46:38 +0000 (0:00:02.566) 0:00:24.459 ********* 2026-04-07 00:46:45.310783 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:46:45.310790 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:45.310798 | orchestrator | } 2026-04-07 00:46:45.310805 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 00:46:45.310812 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:45.310820 | orchestrator | } 2026-04-07 00:46:45.310827 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:46:45.310834 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:45.310842 | orchestrator | } 2026-04-07 00:46:45.310846 | orchestrator | changed: [testbed-node-3] => { 2026-04-07 00:46:45.310851 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:45.310855 | orchestrator | } 2026-04-07 00:46:45.310859 | orchestrator | changed: [testbed-node-4] => { 2026-04-07 00:46:45.310863 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:45.310868 | orchestrator | } 2026-04-07 00:46:45.310872 | orchestrator | changed: [testbed-node-5] => { 2026-04-07 00:46:45.310876 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:46:45.310880 | orchestrator | } 2026-04-07 00:46:45.310885 | orchestrator | 2026-04-07 00:46:45.310889 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:46:45.310894 | orchestrator | Tuesday 07 April 2026 00:46:39 +0000 (0:00:00.469) 0:00:24.929 ********* 2026-04-07 00:46:45.310901 | orchestrator | skipping: 
[testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-07 00:46:45.310906 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-07 00:46:45.310914 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 
'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-07 00:46:45.310919 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-07 00:46:45.310923 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:46:45.310928 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:46:45.310932 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-07 00:46:45.310937 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': 
{'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-07 00:46:45.311266 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-07 00:46:45.311295 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-07 00:46:45.311303 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:46:45.311310 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-07 00:46:45.311320 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-07 00:46:45.311329 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:46:45.311336 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:46:45.311343 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 
'openvswitch_db', 'image': 'registry.osism.tech/kolla/release//openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-07 00:46:45.311369 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release//openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-07 00:46:45.311378 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:46:45.311382 | orchestrator | 2026-04-07 00:46:45.311387 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-07 00:46:45.311391 | orchestrator | Tuesday 07 April 2026 00:46:40 +0000 (0:00:01.480) 0:00:26.409 ********* 2026-04-07 00:46:45.311395 | orchestrator | 2026-04-07 00:46:45.311400 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-07 00:46:45.311404 | orchestrator | Tuesday 07 April 2026 00:46:40 +0000 (0:00:00.141) 0:00:26.550 
********* 2026-04-07 00:46:45.311408 | orchestrator | 2026-04-07 00:46:45.311412 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-07 00:46:45.311417 | orchestrator | Tuesday 07 April 2026 00:46:41 +0000 (0:00:00.241) 0:00:26.792 ********* 2026-04-07 00:46:45.311421 | orchestrator | 2026-04-07 00:46:45.311425 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-07 00:46:45.311429 | orchestrator | Tuesday 07 April 2026 00:46:41 +0000 (0:00:00.218) 0:00:27.011 ********* 2026-04-07 00:46:45.311434 | orchestrator | 2026-04-07 00:46:45.311438 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-07 00:46:45.311445 | orchestrator | Tuesday 07 April 2026 00:46:41 +0000 (0:00:00.325) 0:00:27.336 ********* 2026-04-07 00:46:45.311450 | orchestrator | 2026-04-07 00:46:45.311454 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-07 00:46:45.311458 | orchestrator | Tuesday 07 April 2026 00:46:41 +0000 (0:00:00.129) 0:00:27.466 ********* 2026-04-07 00:46:45.311463 | orchestrator | 2026-04-07 00:46:45.311467 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-db-server container] ******** 2026-04-07 00:46:45.311471 | orchestrator | Tuesday 07 April 2026 00:46:41 +0000 (0:00:00.121) 0:00:27.588 ********* 2026-04-07 00:46:45.311480 | orchestrator | fatal: [testbed-node-1]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_kql3pu6q/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_kql3pu6q/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_kql3pu6q/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_kql3pu6q/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n 
raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:45.311493 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_kashpy0b/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_kashpy0b/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_kashpy0b/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_kashpy0b/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File 
\"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:45.311502 | orchestrator | fatal: [testbed-node-3]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_0ti7mu33/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_0ti7mu33/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_0ti7mu33/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_0ti7mu33/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:45.311517 | orchestrator | fatal: [testbed-node-2]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_gubezzd4/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_gubezzd4/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_gubezzd4/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_gubezzd4/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n 
raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:45.311534 | orchestrator | fatal: [testbed-node-4]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_8ji2iu1c/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_8ji2iu1c/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_8ji2iu1c/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_8ji2iu1c/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File 
\"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:45.311545 | orchestrator | fatal: [testbed-node-5]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload__2ld5bzg/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload__2ld5bzg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload__2ld5bzg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload__2ld5bzg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopenvswitch-db-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:46:45.311556 | orchestrator | 2026-04-07 00:46:45.311563 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:46:45.311570 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-07 00:46:45.311578 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-07 00:46:45.311585 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-07 00:46:45.311592 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-07 00:46:45.311599 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 
2026-04-07 00:46:45.311606 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-07 00:46:45.311613 | orchestrator | 2026-04-07 00:46:45.311621 | orchestrator | 2026-04-07 00:46:45.311628 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:46:45.311635 | orchestrator | Tuesday 07 April 2026 00:46:44 +0000 (0:00:02.995) 0:00:30.583 ********* 2026-04-07 00:46:45.311645 | orchestrator | =============================================================================== 2026-04-07 00:46:45.311652 | orchestrator | openvswitch : Copying over config.json files for services --------------- 4.08s 2026-04-07 00:46:45.311660 | orchestrator | openvswitch : Ensuring config directories exist ------------------------- 3.45s 2026-04-07 00:46:45.311667 | orchestrator | openvswitch : Restart openvswitch-db-server container ------------------- 3.00s 2026-04-07 00:46:45.311674 | orchestrator | module-load : Persist modules via modules-load.d ------------------------ 2.81s 2026-04-07 00:46:45.311681 | orchestrator | openvswitch : include_tasks --------------------------------------------- 2.74s 2026-04-07 00:46:45.311687 | orchestrator | service-check-containers : openvswitch | Check containers --------------- 2.57s 2026-04-07 00:46:45.311694 | orchestrator | module-load : Load modules ---------------------------------------------- 2.15s 2026-04-07 00:46:45.311701 | orchestrator | Group hosts based on enabled services ----------------------------------- 2.12s 2026-04-07 00:46:45.311713 | orchestrator | module-load : Drop module persistence ----------------------------------- 1.63s 2026-04-07 00:46:45.311720 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.48s 2026-04-07 00:46:45.311727 | orchestrator | Group hosts based on Kolla action --------------------------------------- 1.31s 2026-04-07 00:46:45.311733 | orchestrator | 
openvswitch : Flush Handlers -------------------------------------------- 1.18s 2026-04-07 00:46:45.311740 | orchestrator | openvswitch : Create /run/openvswitch directory on host ----------------- 0.68s 2026-04-07 00:46:45.311747 | orchestrator | openvswitch : Copying over ovs-vsctl wrapper ---------------------------- 0.61s 2026-04-07 00:46:45.311755 | orchestrator | service-check-containers : openvswitch | Notify handlers to restart containers --- 0.47s 2026-04-07 00:46:45.311762 | orchestrator | 2026-04-07 00:46:45 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:45.311769 | orchestrator | 2026-04-07 00:46:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:48.346847 | orchestrator | 2026-04-07 00:46:48 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:48.348182 | orchestrator | 2026-04-07 00:46:48 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:48.349501 | orchestrator | 2026-04-07 00:46:48 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:48.350740 | orchestrator | 2026-04-07 00:46:48 | INFO  | Task 2d7e33ab-1b8a-47fc-b25b-add1bff369b6 is in state STARTED 2026-04-07 00:46:48.351915 | orchestrator | 2026-04-07 00:46:48 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:48.351937 | orchestrator | 2026-04-07 00:46:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:51.384710 | orchestrator | 2026-04-07 00:46:51 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:51.393602 | orchestrator | 2026-04-07 00:46:51 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:51.397336 | orchestrator | 2026-04-07 00:46:51 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:51.397781 | orchestrator | 2026-04-07 00:46:51 | INFO  | Task 
2d7e33ab-1b8a-47fc-b25b-add1bff369b6 is in state STARTED 2026-04-07 00:46:51.400791 | orchestrator | 2026-04-07 00:46:51 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:51.400828 | orchestrator | 2026-04-07 00:46:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:54.432948 | orchestrator | 2026-04-07 00:46:54 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:54.433848 | orchestrator | 2026-04-07 00:46:54 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:54.434416 | orchestrator | 2026-04-07 00:46:54 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:54.435473 | orchestrator | 2026-04-07 00:46:54 | INFO  | Task 2d7e33ab-1b8a-47fc-b25b-add1bff369b6 is in state STARTED 2026-04-07 00:46:54.437170 | orchestrator | 2026-04-07 00:46:54 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:54.437245 | orchestrator | 2026-04-07 00:46:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:46:57.468442 | orchestrator | 2026-04-07 00:46:57 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:46:57.468540 | orchestrator | 2026-04-07 00:46:57 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:46:57.469674 | orchestrator | 2026-04-07 00:46:57 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:46:57.470281 | orchestrator | 2026-04-07 00:46:57 | INFO  | Task 2d7e33ab-1b8a-47fc-b25b-add1bff369b6 is in state STARTED 2026-04-07 00:46:57.471315 | orchestrator | 2026-04-07 00:46:57 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:46:57.471388 | orchestrator | 2026-04-07 00:46:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:00.512237 | orchestrator | 2026-04-07 00:47:00 | INFO  | Task 
f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:00.516337 | orchestrator | 2026-04-07 00:47:00 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:00.516428 | orchestrator | 2026-04-07 00:47:00 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:47:00.516438 | orchestrator | 2026-04-07 00:47:00 | INFO  | Task 2d7e33ab-1b8a-47fc-b25b-add1bff369b6 is in state STARTED 2026-04-07 00:47:00.516446 | orchestrator | 2026-04-07 00:47:00 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:00.516452 | orchestrator | 2026-04-07 00:47:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:03.554251 | orchestrator | 2026-04-07 00:47:03 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:03.554950 | orchestrator | 2026-04-07 00:47:03 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:03.555974 | orchestrator | 2026-04-07 00:47:03 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:47:03.557016 | orchestrator | 2026-04-07 00:47:03 | INFO  | Task 2d7e33ab-1b8a-47fc-b25b-add1bff369b6 is in state SUCCESS 2026-04-07 00:47:03.558052 | orchestrator | 2026-04-07 00:47:03.558090 | orchestrator | 2026-04-07 00:47:03.558098 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:47:03.558105 | orchestrator | 2026-04-07 00:47:03.558112 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 00:47:03.558127 | orchestrator | Tuesday 07 April 2026 00:46:48 +0000 (0:00:00.249) 0:00:00.249 ********* 2026-04-07 00:47:03.558134 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:47:03.558141 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:47:03.558148 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:47:03.558154 | orchestrator | ok: 
[testbed-node-3] 2026-04-07 00:47:03.558160 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:47:03.558166 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:47:03.558173 | orchestrator | 2026-04-07 00:47:03.558179 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:47:03.558185 | orchestrator | Tuesday 07 April 2026 00:46:49 +0000 (0:00:00.510) 0:00:00.760 ********* 2026-04-07 00:47:03.558191 | orchestrator | ok: [testbed-node-0] => (item=enable_ovn_True) 2026-04-07 00:47:03.558198 | orchestrator | ok: [testbed-node-1] => (item=enable_ovn_True) 2026-04-07 00:47:03.558204 | orchestrator | ok: [testbed-node-2] => (item=enable_ovn_True) 2026-04-07 00:47:03.558211 | orchestrator | ok: [testbed-node-3] => (item=enable_ovn_True) 2026-04-07 00:47:03.558217 | orchestrator | ok: [testbed-node-4] => (item=enable_ovn_True) 2026-04-07 00:47:03.558223 | orchestrator | ok: [testbed-node-5] => (item=enable_ovn_True) 2026-04-07 00:47:03.558230 | orchestrator | 2026-04-07 00:47:03.558236 | orchestrator | PLAY [Apply role ovn-controller] *********************************************** 2026-04-07 00:47:03.558242 | orchestrator | 2026-04-07 00:47:03.558249 | orchestrator | TASK [ovn-controller : include_tasks] ****************************************** 2026-04-07 00:47:03.558255 | orchestrator | Tuesday 07 April 2026 00:46:49 +0000 (0:00:00.887) 0:00:01.648 ********* 2026-04-07 00:47:03.558262 | orchestrator | included: /ansible/roles/ovn-controller/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:47:03.558284 | orchestrator | 2026-04-07 00:47:03.558291 | orchestrator | TASK [ovn-controller : Ensuring config directories exist] ********************** 2026-04-07 00:47:03.558298 | orchestrator | Tuesday 07 April 2026 00:46:50 +0000 (0:00:00.970) 0:00:02.619 ********* 2026-04-07 00:47:03.558306 | orchestrator | changed: [testbed-node-0] => 
(item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558314 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558321 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558327 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558334 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 
'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558349 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558392 | orchestrator | 2026-04-07 00:47:03.558399 | orchestrator | TASK [ovn-controller : Copying over config.json files for services] ************ 2026-04-07 00:47:03.558409 | orchestrator | Tuesday 07 April 2026 00:46:52 +0000 (0:00:01.709) 0:00:04.328 ********* 2026-04-07 00:47:03.558416 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558422 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558435 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558441 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558448 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558454 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558460 | 
orchestrator | 2026-04-07 00:47:03.558467 | orchestrator | TASK [ovn-controller : Ensuring systemd override directory exists] ************* 2026-04-07 00:47:03.558473 | orchestrator | Tuesday 07 April 2026 00:46:54 +0000 (0:00:01.890) 0:00:06.219 ********* 2026-04-07 00:47:03.558479 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558486 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558505 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558517 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': 
['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558540 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558554 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558564 | orchestrator | 2026-04-07 00:47:03.558574 | orchestrator | TASK [ovn-controller : Copying over systemd override] ************************** 2026-04-07 00:47:03.558584 | orchestrator | Tuesday 07 April 2026 00:46:55 +0000 (0:00:01.172) 0:00:07.391 ********* 2026-04-07 00:47:03.558595 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558606 | orchestrator | changed: [testbed-node-1] => 
(item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558618 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558629 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558640 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558663 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 
'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558683 | orchestrator | 2026-04-07 00:47:03.558694 | orchestrator | TASK [service-check-containers : ovn_controller | Check containers] ************ 2026-04-07 00:47:03.558710 | orchestrator | Tuesday 07 April 2026 00:46:57 +0000 (0:00:01.501) 0:00:08.893 ********* 2026-04-07 00:47:03.558722 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558733 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558744 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558754 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558764 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558776 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 00:47:03.558788 | orchestrator | 2026-04-07 00:47:03.558799 | orchestrator | TASK [service-check-containers : ovn_controller | Notify handlers to restart containers] *** 2026-04-07 00:47:03.558810 | orchestrator | Tuesday 07 April 2026 00:46:58 +0000 (0:00:01.732) 0:00:10.625 ********* 2026-04-07 00:47:03.558821 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:47:03.558833 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:47:03.558845 | orchestrator | } 2026-04-07 
00:47:03.558855 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 00:47:03.558867 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:47:03.558878 | orchestrator | } 2026-04-07 00:47:03.558890 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:47:03.558901 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:47:03.558912 | orchestrator | } 2026-04-07 00:47:03.558924 | orchestrator | changed: [testbed-node-3] => { 2026-04-07 00:47:03.558932 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:47:03.558946 | orchestrator | } 2026-04-07 00:47:03.558954 | orchestrator | changed: [testbed-node-4] => { 2026-04-07 00:47:03.558962 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:47:03.558969 | orchestrator | } 2026-04-07 00:47:03.558977 | orchestrator | changed: [testbed-node-5] => { 2026-04-07 00:47:03.558985 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:47:03.558991 | orchestrator | } 2026-04-07 00:47:03.558998 | orchestrator | 2026-04-07 00:47:03.559004 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:47:03.559016 | orchestrator | Tuesday 07 April 2026 00:46:59 +0000 (0:00:00.739) 0:00:11.365 ********* 2026-04-07 00:47:03.559027 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:47:03.559034 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:47:03.559040 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:47:03.559047 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:47:03.559053 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:47:03.559059 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:47:03.559066 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:47:03.559072 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:47:03.559079 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:47:03.559085 | orchestrator | skipping: 
[testbed-node-4] 2026-04-07 00:47:03.559092 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:47:03.559098 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:47:03.559104 | orchestrator | 2026-04-07 00:47:03.559111 | orchestrator | TASK [ovn-controller : Create br-int bridge on OpenvSwitch] ******************** 2026-04-07 00:47:03.559121 | orchestrator | Tuesday 07 April 2026 00:47:01 +0000 (0:00:01.451) 0:00:12.816 ********* 2026-04-07 00:47:03.559127 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:47:03.559134 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:47:03.559140 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:47:03.559147 | orchestrator | fatal: [testbed-node-3]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:47:03.559154 | orchestrator | fatal: [testbed-node-4]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:47:03.559165 | orchestrator | fatal: [testbed-node-5]: FAILED! 
=> {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:47:03.559182 | orchestrator | 2026-04-07 00:47:03.559195 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:47:03.559212 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:47:03.559228 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:47:03.559240 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:47:03.559251 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:47:03.559263 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:47:03.559275 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-07 00:47:03.559287 | orchestrator | 2026-04-07 00:47:03.559295 | orchestrator | 2026-04-07 00:47:03.559302 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:47:03.559309 | orchestrator | Tuesday 07 April 2026 00:47:02 +0000 (0:00:01.485) 0:00:14.301 ********* 2026-04-07 00:47:03.559316 | orchestrator | =============================================================================== 2026-04-07 00:47:03.559323 | orchestrator | ovn-controller : Copying over config.json files for services ------------ 1.89s 2026-04-07 00:47:03.559330 | orchestrator | service-check-containers : ovn_controller | Check containers ------------ 1.73s 2026-04-07 00:47:03.559337 | orchestrator | ovn-controller : Ensuring config directories exist ---------------------- 1.71s 2026-04-07 00:47:03.559343 | orchestrator | ovn-controller : Copying over systemd override -------------------------- 1.50s 
2026-04-07 00:47:03.559350 | orchestrator | ovn-controller : Create br-int bridge on OpenvSwitch -------------------- 1.49s 2026-04-07 00:47:03.559373 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.45s 2026-04-07 00:47:03.559380 | orchestrator | ovn-controller : Ensuring systemd override directory exists ------------- 1.17s 2026-04-07 00:47:03.559387 | orchestrator | ovn-controller : include_tasks ------------------------------------------ 0.97s 2026-04-07 00:47:03.559394 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.89s 2026-04-07 00:47:03.559401 | orchestrator | service-check-containers : ovn_controller | Notify handlers to restart containers --- 0.74s 2026-04-07 00:47:03.559407 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.51s 2026-04-07 00:47:03.559414 | orchestrator | 2026-04-07 00:47:03 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:03.559428 | orchestrator | 2026-04-07 00:47:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:06.592504 | orchestrator | 2026-04-07 00:47:06 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:06.595332 | orchestrator | 2026-04-07 00:47:06 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:06.596336 | orchestrator | 2026-04-07 00:47:06 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state STARTED 2026-04-07 00:47:06.598761 | orchestrator | 2026-04-07 00:47:06 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:06.598944 | orchestrator | 2026-04-07 00:47:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:09.643199 | orchestrator | 2026-04-07 00:47:09 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:09.643281 | orchestrator | 2026-04-07 00:47:09 | INFO  | Task 
6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:09.643291 | orchestrator | 2026-04-07 00:47:09 | INFO  | Task 2f00dac1-e8e0-4aca-95aa-89b07084475a is in state SUCCESS 2026-04-07 00:47:09.643298 | orchestrator | 2026-04-07 00:47:09 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:09.644325 | orchestrator | 2026-04-07 00:47:09.644390 | orchestrator | 2026-04-07 00:47:09.644400 | orchestrator | PLAY [Set kolla_action_rabbitmq] *********************************************** 2026-04-07 00:47:09.644407 | orchestrator | 2026-04-07 00:47:09.644413 | orchestrator | TASK [Inform the user about the following task] ******************************** 2026-04-07 00:47:09.644418 | orchestrator | Tuesday 07 April 2026 00:46:36 +0000 (0:00:00.102) 0:00:00.102 ********* 2026-04-07 00:47:09.644424 | orchestrator | ok: [localhost] => { 2026-04-07 00:47:09.644432 | orchestrator |  "msg": "The task 'Check RabbitMQ service' fails if the RabbitMQ service has not yet been deployed. This is fine." 2026-04-07 00:47:09.644439 | orchestrator | } 2026-04-07 00:47:09.644446 | orchestrator | 2026-04-07 00:47:09.644454 | orchestrator | TASK [Check RabbitMQ service] ************************************************** 2026-04-07 00:47:09.644461 | orchestrator | Tuesday 07 April 2026 00:46:36 +0000 (0:00:00.034) 0:00:00.137 ********* 2026-04-07 00:47:09.644470 | orchestrator | fatal: [localhost]: FAILED! 
=> {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string RabbitMQ Management in 192.168.16.9:15672"}
2026-04-07 00:47:09.644478 | orchestrator | ...ignoring
2026-04-07 00:47:09.644485 | orchestrator |
2026-04-07 00:47:09.644491 | orchestrator | TASK [Set kolla_action_rabbitmq = upgrade if RabbitMQ is already running] ******
2026-04-07 00:47:09.644498 | orchestrator | Tuesday 07 April 2026 00:46:39 +0000 (0:00:03.224) 0:00:03.361 *********
2026-04-07 00:47:09.644504 | orchestrator | skipping: [localhost]
2026-04-07 00:47:09.644512 | orchestrator |
2026-04-07 00:47:09.644517 | orchestrator | TASK [Set kolla_action_rabbitmq = kolla_action_ng] *****************************
2026-04-07 00:47:09.644526 | orchestrator | Tuesday 07 April 2026 00:46:39 +0000 (0:00:00.077) 0:00:03.439 *********
2026-04-07 00:47:09.644533 | orchestrator | ok: [localhost]
2026-04-07 00:47:09.644540 | orchestrator |
2026-04-07 00:47:09.644547 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:47:09.644554 | orchestrator |
2026-04-07 00:47:09.644561 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 00:47:09.644567 | orchestrator | Tuesday 07 April 2026 00:46:39 +0000 (0:00:00.232) 0:00:03.671 *********
2026-04-07 00:47:09.644574 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:47:09.644581 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:47:09.644587 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:47:09.644594 | orchestrator |
2026-04-07 00:47:09.644600 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 00:47:09.644607 | orchestrator | Tuesday 07 April 2026 00:46:40 +0000 (0:00:00.469) 0:00:04.141 *********
2026-04-07 00:47:09.644635 | orchestrator | ok: [testbed-node-0] => (item=enable_rabbitmq_True)
2026-04-07 00:47:09.644643 | orchestrator | ok: [testbed-node-1] =>
(item=enable_rabbitmq_True) 2026-04-07 00:47:09.644649 | orchestrator | ok: [testbed-node-2] => (item=enable_rabbitmq_True) 2026-04-07 00:47:09.644656 | orchestrator | 2026-04-07 00:47:09.644663 | orchestrator | PLAY [Apply role rabbitmq] ***************************************************** 2026-04-07 00:47:09.644669 | orchestrator | 2026-04-07 00:47:09.644676 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2026-04-07 00:47:09.644683 | orchestrator | Tuesday 07 April 2026 00:46:41 +0000 (0:00:00.683) 0:00:04.825 ********* 2026-04-07 00:47:09.644690 | orchestrator | included: /ansible/roles/rabbitmq/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:47:09.644697 | orchestrator | 2026-04-07 00:47:09.644704 | orchestrator | TASK [rabbitmq : Get container facts] ****************************************** 2026-04-07 00:47:09.644711 | orchestrator | Tuesday 07 April 2026 00:46:41 +0000 (0:00:00.663) 0:00:05.489 ********* 2026-04-07 00:47:09.644717 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:47:09.644724 | orchestrator | 2026-04-07 00:47:09.644731 | orchestrator | TASK [rabbitmq : Get current RabbitMQ version] ********************************* 2026-04-07 00:47:09.644738 | orchestrator | Tuesday 07 April 2026 00:46:43 +0000 (0:00:01.616) 0:00:07.105 ********* 2026-04-07 00:47:09.644744 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:47:09.644752 | orchestrator | 2026-04-07 00:47:09.644759 | orchestrator | TASK [rabbitmq : Get new RabbitMQ version] ************************************* 2026-04-07 00:47:09.644766 | orchestrator | Tuesday 07 April 2026 00:46:43 +0000 (0:00:00.356) 0:00:07.462 ********* 2026-04-07 00:47:09.644772 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:47:09.644779 | orchestrator | 2026-04-07 00:47:09.644785 | orchestrator | TASK [rabbitmq : Check if running RabbitMQ is at most one version behind] ****** 2026-04-07 00:47:09.644792 | 
orchestrator | Tuesday 07 April 2026 00:46:44 +0000 (0:00:00.566) 0:00:08.029 *********
2026-04-07 00:47:09.644799 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:47:09.644805 | orchestrator |
2026-04-07 00:47:09.644812 | orchestrator | TASK [rabbitmq : Catch when RabbitMQ is being downgraded] **********************
2026-04-07 00:47:09.644818 | orchestrator | Tuesday 07 April 2026 00:46:45 +0000 (0:00:00.773) 0:00:08.802 *********
2026-04-07 00:47:09.644825 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:47:09.644832 | orchestrator |
2026-04-07 00:47:09.644840 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2026-04-07 00:47:09.644912 | orchestrator | Tuesday 07 April 2026 00:46:45 +0000 (0:00:00.357) 0:00:09.159 *********
2026-04-07 00:47:09.644926 | orchestrator | included: /ansible/roles/rabbitmq/tasks/remove-ha-all-policy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:47:09.644933 | orchestrator |
2026-04-07 00:47:09.644939 | orchestrator | TASK [rabbitmq : Get container facts] ******************************************
2026-04-07 00:47:09.644946 | orchestrator | Tuesday 07 April 2026 00:46:45 +0000 (0:00:00.528) 0:00:09.688 *********
2026-04-07 00:47:09.644952 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:47:09.644958 | orchestrator |
2026-04-07 00:47:09.644964 | orchestrator | TASK [rabbitmq : List RabbitMQ policies] ***************************************
2026-04-07 00:47:09.644971 | orchestrator | Tuesday 07 April 2026 00:46:46 +0000 (0:00:00.640) 0:00:10.328 *********
2026-04-07 00:47:09.644975 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:47:09.644979 | orchestrator |
2026-04-07 00:47:09.644983 | orchestrator | TASK [rabbitmq : Remove ha-all policy from RabbitMQ] ***************************
2026-04-07 00:47:09.644987 | orchestrator | Tuesday 07 April 2026 00:46:47 +0000 (0:00:00.465) 0:00:10.794 *********
2026-04-07 00:47:09.644994 | orchestrator |
skipping: [testbed-node-0] 2026-04-07 00:47:09.645000 | orchestrator | 2026-04-07 00:47:09.645017 | orchestrator | TASK [rabbitmq : Ensuring config directories exist] **************************** 2026-04-07 00:47:09.645024 | orchestrator | Tuesday 07 April 2026 00:46:47 +0000 (0:00:00.429) 0:00:11.224 ********* 2026-04-07 00:47:09.645038 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645055 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 
'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645063 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645070 | orchestrator | 2026-04-07 00:47:09.645077 | orchestrator | TASK [rabbitmq : Copying over config.json files for services] ****************** 2026-04-07 00:47:09.645083 | orchestrator | Tuesday 07 April 2026 00:46:48 +0000 (0:00:01.388) 0:00:12.613 ********* 2026-04-07 00:47:09.645097 | orchestrator | 
changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645112 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645119 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645126 | orchestrator | 2026-04-07 00:47:09.645132 | orchestrator | TASK [rabbitmq : Copying over rabbitmq-env.conf] ******************************* 2026-04-07 00:47:09.645139 | orchestrator | Tuesday 07 April 2026 00:46:50 +0000 (0:00:01.160) 0:00:13.773 ********* 2026-04-07 00:47:09.645145 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2026-04-07 00:47:09.645152 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2026-04-07 00:47:09.645158 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2026-04-07 00:47:09.645165 | 
orchestrator |
2026-04-07 00:47:09.645171 | orchestrator | TASK [rabbitmq : Copying over rabbitmq.conf] ***********************************
2026-04-07 00:47:09.645177 | orchestrator | Tuesday 07 April 2026 00:46:51 +0000 (0:00:01.402) 0:00:15.176 *********
2026-04-07 00:47:09.645184 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2026-04-07 00:47:09.645190 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2026-04-07 00:47:09.645196 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2026-04-07 00:47:09.645203 | orchestrator |
2026-04-07 00:47:09.645209 | orchestrator | TASK [rabbitmq : Copying over erl_inetrc] **************************************
2026-04-07 00:47:09.645216 | orchestrator | Tuesday 07 April 2026 00:46:53 +0000 (0:00:02.117) 0:00:17.293 *********
2026-04-07 00:47:09.645226 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2026-04-07 00:47:09.645232 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2026-04-07 00:47:09.645238 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2026-04-07 00:47:09.645244 | orchestrator |
2026-04-07 00:47:09.645254 | orchestrator | TASK [rabbitmq : Copying over advanced.config] *********************************
2026-04-07 00:47:09.645260 | orchestrator | Tuesday 07 April 2026 00:46:54 +0000 (0:00:01.409) 0:00:18.703 *********
2026-04-07 00:47:09.645267 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2026-04-07 00:47:09.645273 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2026-04-07 00:47:09.645279 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2026-04-07 00:47:09.645286 | orchestrator |
2026-04-07 00:47:09.645292 | orchestrator | TASK [rabbitmq : Copying over definitions.json] ********************************
2026-04-07 00:47:09.645298 | orchestrator | Tuesday 07 April 2026 00:46:56 +0000 (0:00:01.351) 0:00:20.054 *********
2026-04-07 00:47:09.645305 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2026-04-07 00:47:09.645311 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2026-04-07 00:47:09.645318 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2026-04-07 00:47:09.645324 | orchestrator |
2026-04-07 00:47:09.645330 | orchestrator | TASK [rabbitmq : Copying over enabled_plugins] *********************************
2026-04-07 00:47:09.645340 | orchestrator | Tuesday 07 April 2026 00:46:57 +0000 (0:00:01.303) 0:00:21.358 *********
2026-04-07 00:47:09.645346 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2026-04-07 00:47:09.645408 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2026-04-07 00:47:09.645415 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2026-04-07 00:47:09.645421 | orchestrator |
2026-04-07 00:47:09.645428 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2026-04-07 00:47:09.645434 | orchestrator | Tuesday 07 April 2026 00:46:58 +0000 (0:00:01.362) 0:00:22.720 *********
2026-04-07 00:47:09.645441 | orchestrator | included: /ansible/roles/rabbitmq/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:47:09.645448 | orchestrator |
2026-04-07 00:47:09.645455 | orchestrator | TASK [service-cert-copy : rabbitmq |
Copying over extra CA certificates] ******* 2026-04-07 00:47:09.645461 | orchestrator | Tuesday 07 April 2026 00:46:59 +0000 (0:00:00.880) 0:00:23.600 ********* 2026-04-07 00:47:09.645469 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645481 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645495 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645502 | orchestrator | 2026-04-07 00:47:09.645511 | orchestrator | TASK [service-cert-copy : rabbitmq | Copying over backend internal TLS certificate] *** 2026-04-07 00:47:09.645517 | orchestrator | Tuesday 07 April 2026 00:47:01 +0000 (0:00:01.447) 0:00:25.048 ********* 2026-04-07 00:47:09.645524 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645531 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:47:09.645538 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': 
{'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645549 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:47:09.645561 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645568 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:47:09.645574 | orchestrator | 2026-04-07 00:47:09.645580 | orchestrator | TASK [service-cert-copy : rabbitmq | Copying over backend internal TLS key] **** 2026-04-07 00:47:09.645586 | orchestrator | Tuesday 07 April 2026 00:47:01 +0000 (0:00:00.484) 0:00:25.532 ********* 2026-04-07 00:47:09.645596 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645603 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645614 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:47:09.645621 | orchestrator | skipping: 
[testbed-node-1] 2026-04-07 00:47:09.645627 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645634 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:47:09.645641 | orchestrator | 2026-04-07 00:47:09.645647 | orchestrator | TASK [service-check-containers : rabbitmq | Check containers] ****************** 2026-04-07 00:47:09.645653 | orchestrator | Tuesday 07 April 2026 00:47:02 +0000 (0:00:01.134) 0:00:26.667 ********* 2026-04-07 00:47:09.645664 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645674 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:47:09.645681 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 
'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2026-04-07 00:47:09.645692 | orchestrator |
2026-04-07 00:47:09.645698 | orchestrator | TASK [service-check-containers : rabbitmq | Notify handlers to restart containers] ***
2026-04-07 00:47:09.645704 | orchestrator | Tuesday 07 April 2026 00:47:03 +0000 (0:00:00.957) 0:00:27.624 *********
2026-04-07 00:47:09.645710 | orchestrator | changed: [testbed-node-0] => {
2026-04-07 00:47:09.645717 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 00:47:09.645722 | orchestrator | }
2026-04-07 00:47:09.645726 | orchestrator | changed: [testbed-node-1] => {
2026-04-07 00:47:09.645730 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 00:47:09.645733 | orchestrator | }
2026-04-07 00:47:09.645737 | orchestrator | changed: [testbed-node-2] => {
2026-04-07 00:47:09.645741 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 00:47:09.645745 | orchestrator | }
2026-04-07 00:47:09.645749 | orchestrator |
2026-04-07 00:47:09.645753 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-04-07 00:47:09.645756 | orchestrator | Tuesday 07 April 2026 00:47:04 +0000 (0:00:00.591) 0:00:28.215 *********
2026-04-07 00:47:09.645765 |
orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645769 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:47:09.645778 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645785 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:47:09.645789 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:47:09.645793 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:47:09.645797 | orchestrator | 2026-04-07 00:47:09.645801 | orchestrator | TASK [rabbitmq : Creating rabbitmq volume] ************************************* 2026-04-07 00:47:09.645805 | orchestrator | Tuesday 07 April 2026 00:47:05 +0000 (0:00:01.007) 0:00:29.223 ********* 2026-04-07 00:47:09.645809 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:47:09.645812 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:47:09.645816 | orchestrator | changed: [testbed-node-2] 
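Note that the `image` value in the items above, `registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328`, contains a doubled slash. A minimal sketch of how such a reference typically arises (the variable names here are assumptions for illustration, not the actual kolla-ansible variables): naively joining a registry with a namespace that already carries a trailing slash produces an empty path component.

```python
# Hypothetical illustration: the variable names are assumptions, not the
# real kolla-ansible configuration keys. Joining the pieces naively
# reproduces the doubled slash seen in the 'image' field of the log.
registry = "registry.osism.tech"
namespace = "kolla/release/"          # note the trailing slash
image, tag = "rabbitmq", "4.1.8.20260328"

naive = f"{registry}/{namespace}/{image}:{tag}"
print(naive)  # registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328

# Stripping separators before joining avoids the empty path component:
safe = "/".join(p.strip("/") for p in (registry, namespace, image)) + f":{tag}"
print(safe)   # registry.osism.tech/kolla/release/rabbitmq:4.1.8.20260328
```

In a deployment like this one, the fix would be made in the configuration that supplies the registry/namespace prefix rather than in code, but the joining behaviour is the same.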
2026-04-07 00:47:09.645820 | orchestrator | 2026-04-07 00:47:09.645824 | orchestrator | TASK [rabbitmq : Running RabbitMQ bootstrap container] ************************* 2026-04-07 00:47:09.645828 | orchestrator | Tuesday 07 April 2026 00:47:06 +0000 (0:00:00.865) 0:00:30.089 ********* 2026-04-07 00:47:09.645838 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Frabbitmq\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_61kjli3c/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_61kjli3c/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_61kjli3c/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Frabbitmq: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:47:09.645846 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Frabbitmq\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_b_mu8p5d/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_b_mu8p5d/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_b_mu8p5d/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File 
\"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Frabbitmq: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:47:09.645857 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Frabbitmq\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_scpkduih/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_scpkduih/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_scpkduih/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n 
^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Frabbitmq: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:47:09.645866 | orchestrator | 2026-04-07 00:47:09.645870 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:47:09.645874 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1  2026-04-07 00:47:09.645878 | orchestrator | testbed-node-0 : ok=19  changed=12  unreachable=0 failed=1  skipped=9  rescued=0 ignored=0 2026-04-07 00:47:09.645883 | orchestrator | testbed-node-1 : ok=17  changed=12  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2026-04-07 00:47:09.645886 | orchestrator | testbed-node-2 : ok=17  changed=12  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2026-04-07 00:47:09.645890 | orchestrator | 2026-04-07 00:47:09.645894 | orchestrator | 2026-04-07 00:47:09.645898 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:47:09.645902 | orchestrator | Tuesday 07 April 2026 00:47:07 +0000 (0:00:01.312) 0:00:31.402 ********* 2026-04-07 00:47:09.645906 | orchestrator | =============================================================================== 2026-04-07 00:47:09.645911 | orchestrator | Check RabbitMQ service -------------------------------------------------- 
3.22s 2026-04-07 00:47:09.645917 | orchestrator | rabbitmq : Copying over rabbitmq.conf ----------------------------------- 2.12s 2026-04-07 00:47:09.645923 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 1.62s 2026-04-07 00:47:09.645929 | orchestrator | service-cert-copy : rabbitmq | Copying over extra CA certificates ------- 1.45s 2026-04-07 00:47:09.645935 | orchestrator | rabbitmq : Copying over erl_inetrc -------------------------------------- 1.41s 2026-04-07 00:47:09.645941 | orchestrator | rabbitmq : Copying over rabbitmq-env.conf ------------------------------- 1.40s 2026-04-07 00:47:09.645948 | orchestrator | rabbitmq : Ensuring config directories exist ---------------------------- 1.39s 2026-04-07 00:47:09.645954 | orchestrator | rabbitmq : Copying over enabled_plugins --------------------------------- 1.36s 2026-04-07 00:47:09.645961 | orchestrator | rabbitmq : Copying over advanced.config --------------------------------- 1.35s 2026-04-07 00:47:09.645967 | orchestrator | rabbitmq : Running RabbitMQ bootstrap container ------------------------- 1.31s 2026-04-07 00:47:09.645973 | orchestrator | rabbitmq : Copying over definitions.json -------------------------------- 1.30s 2026-04-07 00:47:09.645979 | orchestrator | rabbitmq : Copying over config.json files for services ------------------ 1.16s 2026-04-07 00:47:09.645985 | orchestrator | service-cert-copy : rabbitmq | Copying over backend internal TLS key ---- 1.14s 2026-04-07 00:47:09.645991 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.01s 2026-04-07 00:47:09.645997 | orchestrator | service-check-containers : rabbitmq | Check containers ------------------ 0.96s 2026-04-07 00:47:09.646007 | orchestrator | rabbitmq : include_tasks ------------------------------------------------ 0.88s 2026-04-07 00:47:09.646056 | orchestrator | rabbitmq : Creating rabbitmq volume ------------------------------------- 0.87s 
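The `400 Bad Request ("invalid reference format")` in the traceback above is the Docker daemon rejecting the pull before it even contacts the registry: `kolla/release//rabbitmq` contains an empty path component, which the image-reference grammar does not allow. A rough sketch of that check, assuming a simplified version of the reference grammar (the real distribution/reference spec is more detailed; this covers the common `registry/namespace/name:tag` shape seen in the log):

```python
import re

# Simplified Docker image-reference pattern (an approximation of the
# distribution/reference grammar, sufficient for the shape in this log).
PATH_COMPONENT = r"[a-z0-9]+(?:(?:[._]|__|[-]+)[a-z0-9]+)*"
REFERENCE = re.compile(
    rf"^(?:[a-zA-Z0-9.-]+(?::\d+)?/)?{PATH_COMPONENT}(?:/{PATH_COMPONENT})*"
    rf"(?::[a-zA-Z0-9_][a-zA-Z0-9._-]*)?$"
)

def is_valid_reference(ref: str) -> bool:
    """Return True if ref looks like a valid image reference."""
    return REFERENCE.match(ref) is not None

# The empty path component produced by the doubled slash is what the
# daemon rejects with "invalid reference format".
broken = "registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328"
fixed = "registry.osism.tech/kolla/release/rabbitmq:4.1.8.20260328"
print(is_valid_reference(broken))  # False
print(is_valid_reference(fixed))   # True
```

Validating the rendered image name with a check like this before the deploy step would surface the misconfiguration earlier than the per-node pull failures seen here.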
2026-04-07 00:47:09.646064 | orchestrator | rabbitmq : Check if running RabbitMQ is at most one version behind ------ 0.77s 2026-04-07 00:47:09.646071 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.68s 2026-04-07 00:47:09.646082 | orchestrator | rabbitmq : include_tasks ------------------------------------------------ 0.66s 2026-04-07 00:47:09.646089 | orchestrator | 2026-04-07 00:47:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:12.689629 | orchestrator | 2026-04-07 00:47:12 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:12.691707 | orchestrator | 2026-04-07 00:47:12 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:12.693854 | orchestrator | 2026-04-07 00:47:12 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:12.694080 | orchestrator | 2026-04-07 00:47:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:15.741211 | orchestrator | 2026-04-07 00:47:15 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:15.741310 | orchestrator | 2026-04-07 00:47:15 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:15.742416 | orchestrator | 2026-04-07 00:47:15 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:15.742475 | orchestrator | 2026-04-07 00:47:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:18.779620 | orchestrator | 2026-04-07 00:47:18 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:18.779791 | orchestrator | 2026-04-07 00:47:18 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:18.782806 | orchestrator | 2026-04-07 00:47:18 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:18.782854 | orchestrator | 2026-04-07 00:47:18 | 
INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:21.819051 | orchestrator | 2026-04-07 00:47:21 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:21.821653 | orchestrator | 2026-04-07 00:47:21 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:21.822265 | orchestrator | 2026-04-07 00:47:21 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:21.822317 | orchestrator | 2026-04-07 00:47:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:24.848907 | orchestrator | 2026-04-07 00:47:24 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:24.850786 | orchestrator | 2026-04-07 00:47:24 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:24.852613 | orchestrator | 2026-04-07 00:47:24 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:24.852644 | orchestrator | 2026-04-07 00:47:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:27.896962 | orchestrator | 2026-04-07 00:47:27 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:27.898184 | orchestrator | 2026-04-07 00:47:27 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:27.899840 | orchestrator | 2026-04-07 00:47:27 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:27.899861 | orchestrator | 2026-04-07 00:47:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:30.932511 | orchestrator | 2026-04-07 00:47:30 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:30.935809 | orchestrator | 2026-04-07 00:47:30 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:30.936625 | orchestrator | 2026-04-07 00:47:30 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in 
state STARTED 2026-04-07 00:47:30.937603 | orchestrator | 2026-04-07 00:47:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:33.983140 | orchestrator | 2026-04-07 00:47:33 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:33.983573 | orchestrator | 2026-04-07 00:47:33 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:33.985996 | orchestrator | 2026-04-07 00:47:33 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:33.986120 | orchestrator | 2026-04-07 00:47:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:37.035869 | orchestrator | 2026-04-07 00:47:37 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:37.037402 | orchestrator | 2026-04-07 00:47:37 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:37.040530 | orchestrator | 2026-04-07 00:47:37 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:37.040590 | orchestrator | 2026-04-07 00:47:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:40.088818 | orchestrator | 2026-04-07 00:47:40 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:40.089616 | orchestrator | 2026-04-07 00:47:40 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:40.091088 | orchestrator | 2026-04-07 00:47:40 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:40.091159 | orchestrator | 2026-04-07 00:47:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:43.132059 | orchestrator | 2026-04-07 00:47:43 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:43.134588 | orchestrator | 2026-04-07 00:47:43 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:43.137032 | orchestrator 
| 2026-04-07 00:47:43 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:43.137087 | orchestrator | 2026-04-07 00:47:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:46.181683 | orchestrator | 2026-04-07 00:47:46 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:46.181756 | orchestrator | 2026-04-07 00:47:46 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:46.182403 | orchestrator | 2026-04-07 00:47:46 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:46.182527 | orchestrator | 2026-04-07 00:47:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:49.210412 | orchestrator | 2026-04-07 00:47:49 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:49.210908 | orchestrator | 2026-04-07 00:47:49 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:49.211649 | orchestrator | 2026-04-07 00:47:49 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:49.211711 | orchestrator | 2026-04-07 00:47:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:52.239409 | orchestrator | 2026-04-07 00:47:52 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:52.239870 | orchestrator | 2026-04-07 00:47:52 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:52.240584 | orchestrator | 2026-04-07 00:47:52 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:52.240616 | orchestrator | 2026-04-07 00:47:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:55.291266 | orchestrator | 2026-04-07 00:47:55 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:55.291404 | orchestrator | 2026-04-07 00:47:55 | INFO  | Task 
6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:55.291416 | orchestrator | 2026-04-07 00:47:55 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:55.291424 | orchestrator | 2026-04-07 00:47:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:47:58.322689 | orchestrator | 2026-04-07 00:47:58 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:47:58.325195 | orchestrator | 2026-04-07 00:47:58 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:47:58.325476 | orchestrator | 2026-04-07 00:47:58 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:47:58.325693 | orchestrator | 2026-04-07 00:47:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:01.379733 | orchestrator | 2026-04-07 00:48:01 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:01.380234 | orchestrator | 2026-04-07 00:48:01 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:01.384170 | orchestrator | 2026-04-07 00:48:01 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:01.384243 | orchestrator | 2026-04-07 00:48:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:04.445709 | orchestrator | 2026-04-07 00:48:04 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:04.447759 | orchestrator | 2026-04-07 00:48:04 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:04.448027 | orchestrator | 2026-04-07 00:48:04 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:04.450008 | orchestrator | 2026-04-07 00:48:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:07.480292 | orchestrator | 2026-04-07 00:48:07 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state 
STARTED 2026-04-07 00:48:07.485733 | orchestrator | 2026-04-07 00:48:07 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:07.485780 | orchestrator | 2026-04-07 00:48:07 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:07.485790 | orchestrator | 2026-04-07 00:48:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:10.525007 | orchestrator | 2026-04-07 00:48:10 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:10.525569 | orchestrator | 2026-04-07 00:48:10 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:10.527728 | orchestrator | 2026-04-07 00:48:10 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:10.527765 | orchestrator | 2026-04-07 00:48:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:13.562494 | orchestrator | 2026-04-07 00:48:13 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:13.564338 | orchestrator | 2026-04-07 00:48:13 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:13.566308 | orchestrator | 2026-04-07 00:48:13 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:13.566367 | orchestrator | 2026-04-07 00:48:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:16.603284 | orchestrator | 2026-04-07 00:48:16 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:16.604942 | orchestrator | 2026-04-07 00:48:16 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:16.605504 | orchestrator | 2026-04-07 00:48:16 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:16.605540 | orchestrator | 2026-04-07 00:48:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:19.638574 | orchestrator | 
2026-04-07 00:48:19 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:19.639195 | orchestrator | 2026-04-07 00:48:19 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:19.640006 | orchestrator | 2026-04-07 00:48:19 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:19.640026 | orchestrator | 2026-04-07 00:48:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:22.692504 | orchestrator | 2026-04-07 00:48:22 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:22.694042 | orchestrator | 2026-04-07 00:48:22 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:22.694950 | orchestrator | 2026-04-07 00:48:22 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:22.695118 | orchestrator | 2026-04-07 00:48:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:25.742171 | orchestrator | 2026-04-07 00:48:25 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:25.745125 | orchestrator | 2026-04-07 00:48:25 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:25.745194 | orchestrator | 2026-04-07 00:48:25 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:25.745229 | orchestrator | 2026-04-07 00:48:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:28.793414 | orchestrator | 2026-04-07 00:48:28 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:28.793947 | orchestrator | 2026-04-07 00:48:28 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:28.794807 | orchestrator | 2026-04-07 00:48:28 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:28.794831 | orchestrator | 2026-04-07 00:48:28 | INFO  | 
Wait 1 second(s) until the next check 2026-04-07 00:48:31.834668 | orchestrator | 2026-04-07 00:48:31 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:31.835612 | orchestrator | 2026-04-07 00:48:31 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:31.836851 | orchestrator | 2026-04-07 00:48:31 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:31.837024 | orchestrator | 2026-04-07 00:48:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:34.873985 | orchestrator | 2026-04-07 00:48:34 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:34.874675 | orchestrator | 2026-04-07 00:48:34 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:34.875677 | orchestrator | 2026-04-07 00:48:34 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:34.875701 | orchestrator | 2026-04-07 00:48:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:37.906201 | orchestrator | 2026-04-07 00:48:37 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:37.906435 | orchestrator | 2026-04-07 00:48:37 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:37.907361 | orchestrator | 2026-04-07 00:48:37 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:48:37.907391 | orchestrator | 2026-04-07 00:48:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:48:40.942990 | orchestrator | 2026-04-07 00:48:40 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED 2026-04-07 00:48:40.945241 | orchestrator | 2026-04-07 00:48:40 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED 2026-04-07 00:48:40.945472 | orchestrator | 2026-04-07 00:48:40 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state 
STARTED
2026-04-07 00:48:40.945626 | orchestrator | 2026-04-07 00:48:40 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:48:43.988054 | orchestrator | 2026-04-07 00:48:43 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:48:43.988648 | orchestrator | 2026-04-07 00:48:43 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:48:43.989600 | orchestrator | 2026-04-07 00:48:43 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:48:43.989638 | orchestrator | 2026-04-07 00:48:43 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:48:47.036772 | orchestrator | 2026-04-07 00:48:47 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:48:47.038557 | orchestrator | 2026-04-07 00:48:47 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:48:47.041220 | orchestrator | 2026-04-07 00:48:47 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:48:47.041270 | orchestrator | 2026-04-07 00:48:47 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:48:50.090426 | orchestrator | 2026-04-07 00:48:50 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:48:50.093048 | orchestrator | 2026-04-07 00:48:50 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:48:50.093198 | orchestrator | 2026-04-07 00:48:50 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:48:50.093471 | orchestrator | 2026-04-07 00:48:50 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:48:53.131985 | orchestrator | 2026-04-07 00:48:53 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:48:53.132930 | orchestrator | 2026-04-07 00:48:53 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:48:53.133791 | orchestrator | 2026-04-07 00:48:53 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:48:53.133845 | orchestrator | 2026-04-07 00:48:53 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:48:56.164487 | orchestrator | 2026-04-07 00:48:56 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:48:56.167699 | orchestrator | 2026-04-07 00:48:56 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:48:56.168085 | orchestrator | 2026-04-07 00:48:56 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:48:56.168342 | orchestrator | 2026-04-07 00:48:56 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:48:59.194650 | orchestrator | 2026-04-07 00:48:59 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:48:59.194832 | orchestrator | 2026-04-07 00:48:59 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:48:59.195454 | orchestrator | 2026-04-07 00:48:59 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:48:59.195626 | orchestrator | 2026-04-07 00:48:59 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:49:02.229432 | orchestrator | 2026-04-07 00:49:02 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:49:02.229859 | orchestrator | 2026-04-07 00:49:02 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:49:02.230540 | orchestrator | 2026-04-07 00:49:02 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:49:02.231230 | orchestrator | 2026-04-07 00:49:02 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:49:05.278597 | orchestrator | 2026-04-07 00:49:05 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:49:05.279007 | orchestrator | 2026-04-07 00:49:05 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:49:05.279750 | orchestrator | 2026-04-07 00:49:05 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:49:05.279774 | orchestrator | 2026-04-07 00:49:05 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:49:08.306144 | orchestrator | 2026-04-07 00:49:08 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:49:08.306485 | orchestrator | 2026-04-07 00:49:08 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:49:08.310067 | orchestrator | 2026-04-07 00:49:08 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:49:08.310124 | orchestrator | 2026-04-07 00:49:08 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:49:11.344020 | orchestrator | 2026-04-07 00:49:11 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:49:11.344370 | orchestrator | 2026-04-07 00:49:11 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:49:11.345393 | orchestrator | 2026-04-07 00:49:11 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:49:11.345447 | orchestrator | 2026-04-07 00:49:11 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:49:14.381843 | orchestrator | 2026-04-07 00:49:14 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state STARTED
2026-04-07 00:49:14.383702 | orchestrator | 2026-04-07 00:49:14 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:49:14.385768 | orchestrator | 2026-04-07 00:49:14 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:49:14.385939 | orchestrator | 2026-04-07 00:49:14 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:49:17.407820 | orchestrator | 2026-04-07 00:49:17 | INFO  | Task f71c121b-9c67-449d-9903-1f6e9f8af346 is in state SUCCESS
2026-04-07 00:49:17.408753 | orchestrator |
2026-04-07 00:49:17.408826 | orchestrator |
2026-04-07 00:49:17.408840 | orchestrator | PLAY [Prepare all k3s nodes] ***************************************************
2026-04-07 00:49:17.408852 | orchestrator |
2026-04-07 00:49:17.408863 | orchestrator | TASK [k3s_prereq : Validating arguments against arg spec 'main' - Prerequisites] ***
2026-04-07 00:49:17.408875 | orchestrator | Tuesday 07 April 2026 00:45:03 +0000 (0:00:00.289) 0:00:00.289 *********
2026-04-07 00:49:17.408887 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:49:17.408899 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:49:17.408911 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:49:17.408922 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.408933 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.408973 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.408984 | orchestrator |
2026-04-07 00:49:17.408997 | orchestrator | TASK [k3s_prereq : Set same timezone on every Server] **************************
2026-04-07 00:49:17.409016 | orchestrator | Tuesday 07 April 2026 00:45:04 +0000 (0:00:00.580) 0:00:00.869 *********
2026-04-07 00:49:17.409035 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.409053 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.409071 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.409088 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.409105 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.409121 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.409139 | orchestrator |
2026-04-07 00:49:17.409157 | orchestrator | TASK [k3s_prereq : Set SELinux to disabled state] ******************************
2026-04-07 00:49:17.409176 | orchestrator | Tuesday 07 April 2026 00:45:04 +0000 (0:00:00.691) 0:00:01.561 *********
2026-04-07 00:49:17.409195 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.409215 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.409232 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.409251 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.409265 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.409276 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.409286 | orchestrator |
2026-04-07 00:49:17.409324 | orchestrator | TASK [k3s_prereq : Enable IPv4 forwarding] *************************************
2026-04-07 00:49:17.409372 | orchestrator | Tuesday 07 April 2026 00:45:05 +0000 (0:00:00.730) 0:00:02.292 *********
2026-04-07 00:49:17.409387 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.409400 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:49:17.409413 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:49:17.409425 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:49:17.409438 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.409450 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.409463 | orchestrator |
2026-04-07 00:49:17.409502 | orchestrator | TASK [k3s_prereq : Enable IPv6 forwarding] *************************************
2026-04-07 00:49:17.409516 | orchestrator | Tuesday 07 April 2026 00:45:07 +0000 (0:00:01.852) 0:00:04.144 *********
2026-04-07 00:49:17.409529 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:49:17.409541 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:49:17.409554 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:49:17.409566 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.409579 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.409592 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.409605 | orchestrator |
2026-04-07 00:49:17.409617 | orchestrator | TASK [k3s_prereq : Enable IPv6 router advertisements] **************************
2026-04-07 00:49:17.409630 | orchestrator | Tuesday 07 April 2026 00:45:09 +0000 (0:00:02.102) 0:00:06.247 *********
2026-04-07 00:49:17.409643 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:49:17.409655 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:49:17.409668 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.409681 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.409694 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.409708 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:49:17.409720 | orchestrator |
2026-04-07 00:49:17.409732 | orchestrator | TASK [k3s_prereq : Add br_netfilter to /etc/modules-load.d/] *******************
2026-04-07 00:49:17.409743 | orchestrator | Tuesday 07 April 2026 00:45:11 +0000 (0:00:01.407) 0:00:08.166 *********
2026-04-07 00:49:17.409754 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.409765 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.409776 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.409787 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.409797 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.409808 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.409819 | orchestrator |
2026-04-07 00:49:17.409830 | orchestrator | TASK [k3s_prereq : Load br_netfilter] ******************************************
2026-04-07 00:49:17.409854 | orchestrator | Tuesday 07 April 2026 00:45:12 +0000 (0:00:00.913) 0:00:09.573 *********
2026-04-07 00:49:17.409865 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.409876 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.409887 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.409898 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.409909 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.409920 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.409931 | orchestrator |
2026-04-07 00:49:17.409942 | orchestrator | TASK [k3s_prereq : Set bridge-nf-call-iptables (just to be sure)] **************
2026-04-07 00:49:17.409953 | orchestrator | Tuesday 07 April 2026 00:45:13 +0000 (0:00:00.913) 0:00:10.487 *********
2026-04-07 00:49:17.410007 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-iptables)
2026-04-07 00:49:17.410101 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-ip6tables)
2026-04-07 00:49:17.410113 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.410124 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-iptables)
2026-04-07 00:49:17.410135 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-ip6tables)
2026-04-07 00:49:17.410146 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.410157 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-iptables)
2026-04-07 00:49:17.410168 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-ip6tables)
2026-04-07 00:49:17.410179 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.410190 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-iptables)
2026-04-07 00:49:17.410220 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-ip6tables)
2026-04-07 00:49:17.410231 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.410243 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-iptables)
2026-04-07 00:49:17.410254 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-ip6tables)
2026-04-07 00:49:17.410265 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.410276 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-iptables)
2026-04-07 00:49:17.410287 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-ip6tables)
2026-04-07 00:49:17.410380 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.410395 | orchestrator |
2026-04-07 00:49:17.410406 | orchestrator | TASK [k3s_prereq : Add /usr/local/bin to sudo secure_path] *********************
2026-04-07 00:49:17.410417 | orchestrator | Tuesday 07 April 2026 00:45:14 +0000 (0:00:00.984) 0:00:11.472 *********
2026-04-07 00:49:17.410428 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.410439 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.410450 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.410461 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.410472 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.410483 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.410494 | orchestrator |
2026-04-07 00:49:17.410505 | orchestrator | TASK [k3s_download : Validating arguments against arg spec 'main' - Manage the downloading of K3S binaries] ***
2026-04-07 00:49:17.410517 | orchestrator | Tuesday 07 April 2026 00:45:16 +0000 (0:00:01.444) 0:00:12.916 *********
2026-04-07 00:49:17.410528 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:49:17.410539 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:49:17.410550 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:49:17.410561 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.410571 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.410582 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.410593 | orchestrator |
2026-04-07 00:49:17.410604 | orchestrator | TASK [k3s_download : Download k3s binary x64] **********************************
2026-04-07 00:49:17.410615 | orchestrator | Tuesday 07 April 2026 00:45:17 +0000 (0:00:01.127) 0:00:14.044 *********
2026-04-07 00:49:17.410637 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:49:17.410648 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:49:17.410659 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.410669 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.410680 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:49:17.410691 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.410702 | orchestrator |
2026-04-07 00:49:17.410713 | orchestrator | TASK [k3s_download : Download k3s binary arm64] ********************************
2026-04-07 00:49:17.410724 | orchestrator | Tuesday 07 April 2026 00:45:24 +0000 (0:00:06.796) 0:00:20.840 *********
2026-04-07 00:49:17.410735 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.410745 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.410756 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.410767 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.410778 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.410789 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.410800 | orchestrator |
2026-04-07 00:49:17.410811 | orchestrator | TASK [k3s_download : Download k3s binary armhf] ********************************
2026-04-07 00:49:17.410829 | orchestrator | Tuesday 07 April 2026 00:45:25 +0000 (0:00:01.658) 0:00:22.498 *********
2026-04-07 00:49:17.410840 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.410851 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.410862 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.410873 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.410884 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.410895 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.410906 | orchestrator |
2026-04-07 00:49:17.410917 | orchestrator | TASK [k3s_custom_registries : Validating arguments against arg spec 'main' - Configure the use of a custom container registry] ***
2026-04-07 00:49:17.410930 | orchestrator | Tuesday 07 April 2026 00:45:27 +0000 (0:00:01.390) 0:00:23.889 *********
2026-04-07 00:49:17.410940 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.410951 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.410962 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.410973 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.410984 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.410995 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.411005 | orchestrator |
2026-04-07 00:49:17.411017 | orchestrator | TASK [k3s_custom_registries : Create directory /etc/rancher/k3s] ***************
2026-04-07 00:49:17.411028 | orchestrator | Tuesday 07 April 2026 00:45:28 +0000 (0:00:01.027) 0:00:24.916 *********
2026-04-07 00:49:17.411062 | orchestrator | skipping: [testbed-node-3] => (item=rancher)
2026-04-07 00:49:17.411075 | orchestrator | skipping: [testbed-node-3] => (item=rancher/k3s)
2026-04-07 00:49:17.411086 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.411097 | orchestrator | skipping: [testbed-node-4] => (item=rancher)
2026-04-07 00:49:17.411109 | orchestrator | skipping: [testbed-node-5] => (item=rancher)
2026-04-07 00:49:17.411120 | orchestrator | skipping: [testbed-node-4] => (item=rancher/k3s)
2026-04-07 00:49:17.411131 | orchestrator | skipping: [testbed-node-5] => (item=rancher/k3s)
2026-04-07 00:49:17.411142 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.411153 | orchestrator | skipping: [testbed-node-0] => (item=rancher)
2026-04-07 00:49:17.411164 | orchestrator | skipping: [testbed-node-0] => (item=rancher/k3s)
2026-04-07 00:49:17.411176 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.411187 | orchestrator | skipping: [testbed-node-1] => (item=rancher)
2026-04-07 00:49:17.411198 | orchestrator | skipping: [testbed-node-1] => (item=rancher/k3s)
2026-04-07 00:49:17.411208 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.411219 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.411231 | orchestrator | skipping: [testbed-node-2] => (item=rancher)
2026-04-07 00:49:17.411250 | orchestrator | skipping: [testbed-node-2] => (item=rancher/k3s)
2026-04-07 00:49:17.411267 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.411326 | orchestrator |
2026-04-07 00:49:17.411346 | orchestrator | TASK [k3s_custom_registries : Insert registries into /etc/rancher/k3s/registries.yaml] ***
2026-04-07 00:49:17.411376 | orchestrator | Tuesday 07 April 2026 00:45:28 +0000 (0:00:00.683) 0:00:25.600 *********
2026-04-07 00:49:17.411389 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.411401 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.411412 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.411423 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.411434 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.411445 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.411456 | orchestrator |
2026-04-07 00:49:17.411467 | orchestrator | TASK [k3s_custom_registries : Remove /etc/rancher/k3s/registries.yaml when no registries configured] ***
2026-04-07 00:49:17.411478 | orchestrator | Tuesday 07 April 2026 00:45:29 +0000 (0:00:01.081) 0:00:26.681 *********
2026-04-07 00:49:17.411490 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.411501 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.411512 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.411523 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.411534 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.411545 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.411556 | orchestrator |
2026-04-07 00:49:17.411567 | orchestrator | PLAY [Deploy k3s master nodes] *************************************************
2026-04-07 00:49:17.411578 | orchestrator |
2026-04-07 00:49:17.411590 | orchestrator | TASK [k3s_server : Validating arguments against arg spec 'main' - Setup k3s servers] ***
2026-04-07 00:49:17.411601 | orchestrator | Tuesday 07 April 2026 00:45:31 +0000 (0:00:01.234) 0:00:27.916 *********
2026-04-07 00:49:17.411612 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.411624 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.411635 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.411646 | orchestrator |
2026-04-07 00:49:17.411657 | orchestrator | TASK [k3s_server : Stop k3s-init] **********************************************
2026-04-07 00:49:17.411668 | orchestrator | Tuesday 07 April 2026 00:45:32 +0000 (0:00:01.176) 0:00:29.093 *********
2026-04-07 00:49:17.411680 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.411691 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.411702 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.411713 | orchestrator |
2026-04-07 00:49:17.411724 | orchestrator | TASK [k3s_server : Stop k3s] ***************************************************
2026-04-07 00:49:17.411735 | orchestrator | Tuesday 07 April 2026 00:45:33 +0000 (0:00:01.057) 0:00:30.151 *********
2026-04-07 00:49:17.411746 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.411757 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.411769 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.411779 | orchestrator |
2026-04-07 00:49:17.411790 | orchestrator | TASK [k3s_server : Clean previous runs of k3s-init] ****************************
2026-04-07 00:49:17.411815 | orchestrator | Tuesday 07 April 2026 00:45:34 +0000 (0:00:00.948) 0:00:31.099 *********
2026-04-07 00:49:17.411827 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.411850 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.411861 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.411872 | orchestrator |
2026-04-07 00:49:17.411884 | orchestrator | TASK [k3s_server : Deploy K3s http_proxy conf] *********************************
2026-04-07 00:49:17.411895 | orchestrator | Tuesday 07 April 2026 00:45:35 +0000 (0:00:00.912) 0:00:32.012 *********
2026-04-07 00:49:17.411906 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.411917 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.411928 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.411939 | orchestrator |
2026-04-07 00:49:17.411950 | orchestrator | TASK [k3s_server : Create /etc/rancher/k3s directory] **************************
2026-04-07 00:49:17.411968 | orchestrator | Tuesday 07 April 2026 00:45:35 +0000 (0:00:00.288) 0:00:32.300 *********
2026-04-07 00:49:17.411979 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.411991 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.412002 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.412021 | orchestrator |
2026-04-07 00:49:17.412032 | orchestrator | TASK [k3s_server : Create custom resolv.conf for k3s] **************************
2026-04-07 00:49:17.412043 | orchestrator | Tuesday 07 April 2026 00:45:36 +0000 (0:00:00.829) 0:00:33.130 *********
2026-04-07 00:49:17.412054 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.412066 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.412077 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.412088 | orchestrator |
2026-04-07 00:49:17.412099 | orchestrator | TASK [k3s_server : Deploy vip manifest] ****************************************
2026-04-07 00:49:17.412110 | orchestrator | Tuesday 07 April 2026 00:45:37 +0000 (0:00:01.427) 0:00:34.557 *********
2026-04-07 00:49:17.412121 | orchestrator | included: /ansible/roles/k3s_server/tasks/vip.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:49:17.412132 | orchestrator |
2026-04-07 00:49:17.412143 | orchestrator | TASK [k3s_server : Set _kube_vip_bgp_peers fact] *******************************
2026-04-07 00:49:17.412154 | orchestrator | Tuesday 07 April 2026 00:45:38 +0000 (0:00:00.744) 0:00:35.301 *********
2026-04-07 00:49:17.412165 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.412176 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.412187 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.412198 | orchestrator |
2026-04-07 00:49:17.412209 | orchestrator | TASK [k3s_server : Create manifests directory on first master] *****************
2026-04-07 00:49:17.412220 | orchestrator | Tuesday 07 April 2026 00:45:40 +0000 (0:00:01.968) 0:00:37.270 *********
2026-04-07 00:49:17.412232 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.412242 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.412253 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.412264 | orchestrator |
2026-04-07 00:49:17.412275 | orchestrator | TASK [k3s_server : Download vip rbac manifest to first master] *****************
2026-04-07 00:49:17.412286 | orchestrator | Tuesday 07 April 2026 00:45:41 +0000 (0:00:00.683) 0:00:37.953 *********
2026-04-07 00:49:17.412320 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.412332 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.412343 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.412354 | orchestrator |
2026-04-07 00:49:17.412365 | orchestrator | TASK [k3s_server : Copy vip manifest to first master] **************************
2026-04-07 00:49:17.412376 | orchestrator | Tuesday 07 April 2026 00:45:42 +0000 (0:00:01.093) 0:00:39.046 *********
2026-04-07 00:49:17.412387 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.412398 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.412412 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.412432 | orchestrator |
2026-04-07 00:49:17.412459 | orchestrator | TASK [k3s_server : Deploy metallb manifest] ************************************
2026-04-07 00:49:17.412491 | orchestrator | Tuesday 07 April 2026 00:45:43 +0000 (0:00:01.694) 0:00:40.741 *********
2026-04-07 00:49:17.412509 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.412529 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.412545 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.412563 | orchestrator |
2026-04-07 00:49:17.412580 | orchestrator | TASK [k3s_server : Deploy kube-vip manifest] ***********************************
2026-04-07 00:49:17.412600 | orchestrator | Tuesday 07 April 2026 00:45:44 +0000 (0:00:00.470) 0:00:41.211 *********
2026-04-07 00:49:17.412618 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.412636 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.412654 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.412672 | orchestrator |
2026-04-07 00:49:17.412693 | orchestrator | TASK [k3s_server : Init cluster inside the transient k3s-init service] *********
2026-04-07 00:49:17.412712 | orchestrator | Tuesday 07 April 2026 00:45:44 +0000 (0:00:00.410) 0:00:41.622 *********
2026-04-07 00:49:17.412733 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.412752 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.412770 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.412782 | orchestrator |
2026-04-07 00:49:17.412793 | orchestrator | TASK [k3s_server : Detect Kubernetes version for label compatibility] **********
2026-04-07 00:49:17.412827 | orchestrator | Tuesday 07 April 2026 00:45:46 +0000 (0:00:01.872) 0:00:43.494 *********
2026-04-07 00:49:17.412838 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.412850 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.412861 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.412872 | orchestrator |
2026-04-07 00:49:17.412883 | orchestrator | TASK [k3s_server : Set node role label selector based on Kubernetes version] ***
2026-04-07 00:49:17.412894 | orchestrator | Tuesday 07 April 2026 00:45:49 +0000 (0:00:02.907) 0:00:46.401 *********
2026-04-07 00:49:17.412905 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.412916 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.412927 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.412938 | orchestrator |
2026-04-07 00:49:17.412949 | orchestrator | TASK [k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails)] ***
2026-04-07 00:49:17.412960 | orchestrator | Tuesday 07 April 2026 00:45:50 +0000 (0:00:00.427) 0:00:46.829 *********
2026-04-07 00:49:17.412972 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left).
2026-04-07 00:49:17.412984 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left).
2026-04-07 00:49:17.412995 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left).
2026-04-07 00:49:17.413006 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left).
2026-04-07 00:49:17.413018 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left).
2026-04-07 00:49:17.413036 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left).
2026-04-07 00:49:17.413048 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left).
2026-04-07 00:49:17.413059 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left).
2026-04-07 00:49:17.413070 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left).
2026-04-07 00:49:17.413081 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left).
2026-04-07 00:49:17.413092 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left).
2026-04-07 00:49:17.413103 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left).
2026-04-07 00:49:17.413115 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.413126 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.413137 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.413148 | orchestrator |
2026-04-07 00:49:17.413159 | orchestrator | TASK [k3s_server : Save logs of k3s-init.service] ******************************
2026-04-07 00:49:17.413170 | orchestrator | Tuesday 07 April 2026 00:46:33 +0000 (0:00:43.939) 0:01:30.768 *********
2026-04-07 00:49:17.413181 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.413193 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.413204 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.413215 | orchestrator |
2026-04-07 00:49:17.413226 | orchestrator | TASK [k3s_server : Kill the temporary service used for initialization] *********
2026-04-07 00:49:17.413237 | orchestrator | Tuesday 07 April 2026 00:46:34 +0000 (0:00:00.260) 0:01:31.029 *********
2026-04-07 00:49:17.413248 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.413259 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.413279 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.413290 | orchestrator |
2026-04-07 00:49:17.413359 | orchestrator | TASK [k3s_server : Copy K3s service file] **************************************
2026-04-07 00:49:17.413371 | orchestrator | Tuesday 07 April 2026 00:46:35 +0000 (0:00:01.296) 0:01:32.325 *********
2026-04-07 00:49:17.413383 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.413394 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.413405 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.413416 | orchestrator |
2026-04-07 00:49:17.413437 | orchestrator | TASK [k3s_server : Enable and check K3s service] *******************************
2026-04-07 00:49:17.413449 | orchestrator | Tuesday 07 April 2026 00:46:36 +0000 (0:00:01.347) 0:01:33.673 *********
2026-04-07 00:49:17.413460 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.413472 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.413483 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.413494 | orchestrator |
2026-04-07 00:49:17.413505 | orchestrator | TASK [k3s_server : Wait for node-token] ****************************************
2026-04-07 00:49:17.413515 | orchestrator | Tuesday 07 April 2026 00:47:00 +0000 (0:00:24.112) 0:01:57.785 *********
2026-04-07 00:49:17.413525 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.413535 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.413544 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.413554 | orchestrator |
2026-04-07 00:49:17.413564 | orchestrator | TASK [k3s_server : Register node-token file access mode] ***********************
2026-04-07 00:49:17.413574 | orchestrator | Tuesday 07 April 2026 00:47:01 +0000 (0:00:00.746) 0:01:58.532 *********
2026-04-07 00:49:17.413584 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.413594 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.413604 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.413614 | orchestrator |
2026-04-07 00:49:17.413624 | orchestrator | TASK [k3s_server : Change file access node-token] ******************************
2026-04-07 00:49:17.413634 | orchestrator | Tuesday 07 April 2026 00:47:02 +0000 (0:00:00.862) 0:01:59.394 *********
2026-04-07 00:49:17.413644 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.413654 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.413665 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.413675 | orchestrator |
2026-04-07 00:49:17.413684 | orchestrator | TASK [k3s_server : Read node-token from master] ********************************
2026-04-07 00:49:17.413695 | orchestrator | Tuesday 07 April 2026 00:47:03 +0000 (0:00:00.673) 0:02:00.068 *********
2026-04-07 00:49:17.413704 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.413714 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.413724 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.413734 | orchestrator |
2026-04-07 00:49:17.413744 | orchestrator | TASK [k3s_server : Store Master node-token] ************************************
2026-04-07 00:49:17.413754 | orchestrator | Tuesday 07 April 2026 00:47:03 +0000 (0:00:00.637) 0:02:00.706 *********
2026-04-07 00:49:17.413764 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:49:17.413774 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:49:17.413784 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:49:17.413793 | orchestrator |
2026-04-07 00:49:17.413804 | orchestrator | TASK [k3s_server : Restore node-token file access] *****************************
2026-04-07 00:49:17.413814 | orchestrator | Tuesday 07 April 2026 00:47:04 +0000 (0:00:00.294) 0:02:01.001 *********
2026-04-07 00:49:17.413824 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.413833 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.413843 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.413853 | orchestrator |
2026-04-07 00:49:17.413863 | orchestrator | TASK [k3s_server : Create directory .kube] *************************************
2026-04-07 00:49:17.413873 | orchestrator | Tuesday 07 April 2026 00:47:04 +0000 (0:00:00.563) 0:02:01.564 *********
2026-04-07 00:49:17.413883 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.413893 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.413903 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.413913 | orchestrator |
2026-04-07 00:49:17.413934 | orchestrator | TASK [k3s_server : Copy config file to user home directory] ********************
2026-04-07 00:49:17.413944 | orchestrator | Tuesday 07 April 2026 00:47:05 +0000 (0:00:00.857) 0:02:02.422 *********
2026-04-07 00:49:17.413954 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.413964 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.413973 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.413983 | orchestrator |
2026-04-07 00:49:17.413993 | orchestrator | TASK [k3s_server : Configure kubectl cluster to https://192.168.16.8:6443] *****
2026-04-07 00:49:17.414003 | orchestrator | Tuesday 07 April 2026 00:47:06 +0000 (0:00:01.027) 0:02:03.449 *********
2026-04-07 00:49:17.414044 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:49:17.414056 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:49:17.414066 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:49:17.414076 | orchestrator |
2026-04-07 00:49:17.414086 | orchestrator | TASK [k3s_server : Create kubectl symlink] *************************************
2026-04-07 00:49:17.414096 | orchestrator | Tuesday 07 April 2026 00:47:07 +0000 (0:00:00.973) 0:02:04.423 *********
2026-04-07 00:49:17.414105 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.414115 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.414125 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.414135 | orchestrator |
2026-04-07 00:49:17.414144 | orchestrator | TASK [k3s_server : Create crictl symlink] **************************************
2026-04-07 00:49:17.414155 | orchestrator | Tuesday 07 April 2026 00:47:07 +0000 (0:00:00.291) 0:02:04.714 *********
2026-04-07 00:49:17.414165 | orchestrator | skipping: [testbed-node-0]
2026-04-07
00:49:17.414174 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:49:17.414184 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:49:17.414194 | orchestrator | 2026-04-07 00:49:17.414204 | orchestrator | TASK [k3s_server : Get contents of manifests folder] *************************** 2026-04-07 00:49:17.414214 | orchestrator | Tuesday 07 April 2026 00:47:08 +0000 (0:00:00.434) 0:02:05.149 ********* 2026-04-07 00:49:17.414223 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:49:17.414233 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:49:17.414243 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:49:17.414253 | orchestrator | 2026-04-07 00:49:17.414263 | orchestrator | TASK [k3s_server : Get sub dirs of manifests folder] *************************** 2026-04-07 00:49:17.414273 | orchestrator | Tuesday 07 April 2026 00:47:09 +0000 (0:00:00.680) 0:02:05.829 ********* 2026-04-07 00:49:17.414283 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:49:17.414293 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:49:17.414322 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:49:17.414332 | orchestrator | 2026-04-07 00:49:17.414342 | orchestrator | TASK [k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start] *** 2026-04-07 00:49:17.414352 | orchestrator | Tuesday 07 April 2026 00:47:09 +0000 (0:00:00.709) 0:02:06.539 ********* 2026-04-07 00:49:17.414362 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2026-04-07 00:49:17.414379 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2026-04-07 00:49:17.414389 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2026-04-07 00:49:17.414399 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2026-04-07 
00:49:17.414409 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2026-04-07 00:49:17.414419 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2026-04-07 00:49:17.414428 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2026-04-07 00:49:17.414439 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2026-04-07 00:49:17.414449 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2026-04-07 00:49:17.414466 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2026-04-07 00:49:17.414476 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip.yaml) 2026-04-07 00:49:17.414486 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2026-04-07 00:49:17.414495 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2026-04-07 00:49:17.414505 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip-rbac.yaml) 2026-04-07 00:49:17.414515 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2026-04-07 00:49:17.414524 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2026-04-07 00:49:17.414534 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2026-04-07 00:49:17.414544 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2026-04-07 00:49:17.414554 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2026-04-07 00:49:17.414563 | orchestrator 
| changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2026-04-07 00:49:17.414573 | orchestrator | 2026-04-07 00:49:17.414583 | orchestrator | PLAY [Deploy k3s worker nodes] ************************************************* 2026-04-07 00:49:17.414593 | orchestrator | 2026-04-07 00:49:17.414603 | orchestrator | TASK [k3s_agent : Validating arguments against arg spec 'main' - Setup k3s agents] *** 2026-04-07 00:49:17.414612 | orchestrator | Tuesday 07 April 2026 00:47:13 +0000 (0:00:03.899) 0:02:10.439 ********* 2026-04-07 00:49:17.414622 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:49:17.414632 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:49:17.414648 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:49:17.414658 | orchestrator | 2026-04-07 00:49:17.414668 | orchestrator | TASK [k3s_agent : Check if system is PXE-booted] ******************************* 2026-04-07 00:49:17.414678 | orchestrator | Tuesday 07 April 2026 00:47:13 +0000 (0:00:00.342) 0:02:10.781 ********* 2026-04-07 00:49:17.414688 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:49:17.414698 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:49:17.414707 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:49:17.414717 | orchestrator | 2026-04-07 00:49:17.414727 | orchestrator | TASK [k3s_agent : Set fact for PXE-booted system] ****************************** 2026-04-07 00:49:17.414737 | orchestrator | Tuesday 07 April 2026 00:47:14 +0000 (0:00:00.669) 0:02:11.450 ********* 2026-04-07 00:49:17.414747 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:49:17.414757 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:49:17.414766 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:49:17.414776 | orchestrator | 2026-04-07 00:49:17.414786 | orchestrator | TASK [k3s_agent : Include http_proxy configuration tasks] ********************** 2026-04-07 00:49:17.414796 | orchestrator | Tuesday 07 April 2026 00:47:15 +0000 (0:00:00.374) 0:02:11.825 ********* 
2026-04-07 00:49:17.414806 | orchestrator | included: /ansible/roles/k3s_agent/tasks/http_proxy.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:49:17.414815 | orchestrator | 2026-04-07 00:49:17.414825 | orchestrator | TASK [k3s_agent : Create k3s-node.service.d directory] ************************* 2026-04-07 00:49:17.414835 | orchestrator | Tuesday 07 April 2026 00:47:15 +0000 (0:00:00.646) 0:02:12.471 ********* 2026-04-07 00:49:17.414845 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:49:17.414855 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:49:17.414865 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:49:17.414874 | orchestrator | 2026-04-07 00:49:17.414884 | orchestrator | TASK [k3s_agent : Copy K3s http_proxy conf file] ******************************* 2026-04-07 00:49:17.414894 | orchestrator | Tuesday 07 April 2026 00:47:15 +0000 (0:00:00.311) 0:02:12.783 ********* 2026-04-07 00:49:17.414904 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:49:17.414914 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:49:17.414931 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:49:17.414941 | orchestrator | 2026-04-07 00:49:17.414951 | orchestrator | TASK [k3s_agent : Deploy K3s http_proxy conf] ********************************** 2026-04-07 00:49:17.414961 | orchestrator | Tuesday 07 April 2026 00:47:16 +0000 (0:00:00.342) 0:02:13.126 ********* 2026-04-07 00:49:17.414971 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:49:17.414981 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:49:17.414990 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:49:17.415000 | orchestrator | 2026-04-07 00:49:17.415010 | orchestrator | TASK [k3s_agent : Create /etc/rancher/k3s directory] *************************** 2026-04-07 00:49:17.415020 | orchestrator | Tuesday 07 April 2026 00:47:16 +0000 (0:00:00.506) 0:02:13.632 ********* 2026-04-07 00:49:17.415029 | orchestrator | changed: 
[testbed-node-3] 2026-04-07 00:49:17.415039 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:49:17.415049 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:49:17.415059 | orchestrator | 2026-04-07 00:49:17.415075 | orchestrator | TASK [k3s_agent : Create custom resolv.conf for k3s] *************************** 2026-04-07 00:49:17.415085 | orchestrator | Tuesday 07 April 2026 00:47:17 +0000 (0:00:00.661) 0:02:14.294 ********* 2026-04-07 00:49:17.415095 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:49:17.415105 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:49:17.415115 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:49:17.415124 | orchestrator | 2026-04-07 00:49:17.415134 | orchestrator | TASK [k3s_agent : Configure the k3s service] *********************************** 2026-04-07 00:49:17.415144 | orchestrator | Tuesday 07 April 2026 00:47:18 +0000 (0:00:01.175) 0:02:15.469 ********* 2026-04-07 00:49:17.415154 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:49:17.415164 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:49:17.415173 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:49:17.415183 | orchestrator | 2026-04-07 00:49:17.415193 | orchestrator | TASK [k3s_agent : Manage k3s service] ****************************************** 2026-04-07 00:49:17.415202 | orchestrator | Tuesday 07 April 2026 00:47:20 +0000 (0:00:01.374) 0:02:16.844 ********* 2026-04-07 00:49:17.415212 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:49:17.415222 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:49:17.415232 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:49:17.415242 | orchestrator | 2026-04-07 00:49:17.415251 | orchestrator | PLAY [Prepare kubeconfig file] ************************************************* 2026-04-07 00:49:17.415261 | orchestrator | 2026-04-07 00:49:17.415271 | orchestrator | TASK [Get home directory of operator user] ************************************* 2026-04-07 
00:49:17.415281 | orchestrator | Tuesday 07 April 2026 00:47:29 +0000 (0:00:09.886) 0:02:26.731 ********* 2026-04-07 00:49:17.415291 | orchestrator | ok: [testbed-manager] 2026-04-07 00:49:17.415346 | orchestrator | 2026-04-07 00:49:17.415363 | orchestrator | TASK [Create .kube directory] ************************************************** 2026-04-07 00:49:17.415380 | orchestrator | Tuesday 07 April 2026 00:47:30 +0000 (0:00:00.773) 0:02:27.504 ********* 2026-04-07 00:49:17.415399 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.415423 | orchestrator | 2026-04-07 00:49:17.415439 | orchestrator | TASK [Get kubeconfig file] ***************************************************** 2026-04-07 00:49:17.415455 | orchestrator | Tuesday 07 April 2026 00:47:31 +0000 (0:00:00.422) 0:02:27.926 ********* 2026-04-07 00:49:17.415470 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] 2026-04-07 00:49:17.415485 | orchestrator | 2026-04-07 00:49:17.415499 | orchestrator | TASK [Write kubeconfig file] *************************************************** 2026-04-07 00:49:17.415515 | orchestrator | Tuesday 07 April 2026 00:47:31 +0000 (0:00:00.547) 0:02:28.473 ********* 2026-04-07 00:49:17.415532 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.415548 | orchestrator | 2026-04-07 00:49:17.415564 | orchestrator | TASK [Change server address in the kubeconfig] ********************************* 2026-04-07 00:49:17.415579 | orchestrator | Tuesday 07 April 2026 00:47:32 +0000 (0:00:00.781) 0:02:29.255 ********* 2026-04-07 00:49:17.415595 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.415609 | orchestrator | 2026-04-07 00:49:17.415638 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************ 2026-04-07 00:49:17.415655 | orchestrator | Tuesday 07 April 2026 00:47:32 +0000 (0:00:00.527) 0:02:29.783 ********* 2026-04-07 00:49:17.415669 | orchestrator | changed: [testbed-manager -> localhost] 
2026-04-07 00:49:17.415684 | orchestrator | 2026-04-07 00:49:17.415708 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ****** 2026-04-07 00:49:17.415723 | orchestrator | Tuesday 07 April 2026 00:47:34 +0000 (0:00:01.675) 0:02:31.458 ********* 2026-04-07 00:49:17.415741 | orchestrator | changed: [testbed-manager -> localhost] 2026-04-07 00:49:17.415753 | orchestrator | 2026-04-07 00:49:17.415763 | orchestrator | TASK [Set KUBECONFIG environment variable] ************************************* 2026-04-07 00:49:17.415773 | orchestrator | Tuesday 07 April 2026 00:47:35 +0000 (0:00:00.780) 0:02:32.238 ********* 2026-04-07 00:49:17.415783 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.415792 | orchestrator | 2026-04-07 00:49:17.415802 | orchestrator | TASK [Enable kubectl command line completion] ********************************** 2026-04-07 00:49:17.415812 | orchestrator | Tuesday 07 April 2026 00:47:35 +0000 (0:00:00.331) 0:02:32.570 ********* 2026-04-07 00:49:17.415820 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.415828 | orchestrator | 2026-04-07 00:49:17.415836 | orchestrator | PLAY [Apply role kubectl] ****************************************************** 2026-04-07 00:49:17.415844 | orchestrator | 2026-04-07 00:49:17.415852 | orchestrator | TASK [kubectl : Gather variables for each operating system] ******************** 2026-04-07 00:49:17.415859 | orchestrator | Tuesday 07 April 2026 00:47:36 +0000 (0:00:00.351) 0:02:32.922 ********* 2026-04-07 00:49:17.415867 | orchestrator | ok: [testbed-manager] 2026-04-07 00:49:17.415875 | orchestrator | 2026-04-07 00:49:17.415883 | orchestrator | TASK [kubectl : Include distribution specific install tasks] ******************* 2026-04-07 00:49:17.415891 | orchestrator | Tuesday 07 April 2026 00:47:36 +0000 (0:00:00.135) 0:02:33.058 ********* 2026-04-07 00:49:17.415899 | orchestrator | included: /ansible/roles/kubectl/tasks/install-Debian-family.yml 
for testbed-manager 2026-04-07 00:49:17.415907 | orchestrator | 2026-04-07 00:49:17.415915 | orchestrator | TASK [kubectl : Remove old architecture-dependent repository] ****************** 2026-04-07 00:49:17.415922 | orchestrator | Tuesday 07 April 2026 00:47:36 +0000 (0:00:00.210) 0:02:33.268 ********* 2026-04-07 00:49:17.415930 | orchestrator | ok: [testbed-manager] 2026-04-07 00:49:17.415938 | orchestrator | 2026-04-07 00:49:17.415946 | orchestrator | TASK [kubectl : Install apt-transport-https package] *************************** 2026-04-07 00:49:17.415954 | orchestrator | Tuesday 07 April 2026 00:47:37 +0000 (0:00:00.687) 0:02:33.955 ********* 2026-04-07 00:49:17.415962 | orchestrator | ok: [testbed-manager] 2026-04-07 00:49:17.415970 | orchestrator | 2026-04-07 00:49:17.415978 | orchestrator | TASK [kubectl : Add repository gpg key] **************************************** 2026-04-07 00:49:17.415986 | orchestrator | Tuesday 07 April 2026 00:47:38 +0000 (0:00:01.457) 0:02:35.413 ********* 2026-04-07 00:49:17.415994 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.416002 | orchestrator | 2026-04-07 00:49:17.416010 | orchestrator | TASK [kubectl : Set permissions of gpg key] ************************************ 2026-04-07 00:49:17.416018 | orchestrator | Tuesday 07 April 2026 00:47:39 +0000 (0:00:00.998) 0:02:36.412 ********* 2026-04-07 00:49:17.416026 | orchestrator | ok: [testbed-manager] 2026-04-07 00:49:17.416034 | orchestrator | 2026-04-07 00:49:17.416049 | orchestrator | TASK [kubectl : Add repository Debian] ***************************************** 2026-04-07 00:49:17.416057 | orchestrator | Tuesday 07 April 2026 00:47:40 +0000 (0:00:00.478) 0:02:36.891 ********* 2026-04-07 00:49:17.416065 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.416073 | orchestrator | 2026-04-07 00:49:17.416082 | orchestrator | TASK [kubectl : Install required packages] ************************************* 2026-04-07 00:49:17.416089 | 
orchestrator | Tuesday 07 April 2026 00:47:47 +0000 (0:00:07.610) 0:02:44.501 ********* 2026-04-07 00:49:17.416097 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.416105 | orchestrator | 2026-04-07 00:49:17.416113 | orchestrator | TASK [kubectl : Remove kubectl symlink] **************************************** 2026-04-07 00:49:17.416128 | orchestrator | Tuesday 07 April 2026 00:47:59 +0000 (0:00:12.315) 0:02:56.817 ********* 2026-04-07 00:49:17.416136 | orchestrator | ok: [testbed-manager] 2026-04-07 00:49:17.416144 | orchestrator | 2026-04-07 00:49:17.416152 | orchestrator | PLAY [Run post actions on master nodes] **************************************** 2026-04-07 00:49:17.416159 | orchestrator | 2026-04-07 00:49:17.416167 | orchestrator | TASK [k3s_server_post : Validating arguments against arg spec 'main' - Configure k3s cluster] *** 2026-04-07 00:49:17.416175 | orchestrator | Tuesday 07 April 2026 00:48:00 +0000 (0:00:00.556) 0:02:57.373 ********* 2026-04-07 00:49:17.416183 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:49:17.416191 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:49:17.416199 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:49:17.416207 | orchestrator | 2026-04-07 00:49:17.416215 | orchestrator | TASK [k3s_server_post : Deploy calico] ***************************************** 2026-04-07 00:49:17.416223 | orchestrator | Tuesday 07 April 2026 00:48:00 +0000 (0:00:00.388) 0:02:57.762 ********* 2026-04-07 00:49:17.416230 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:49:17.416238 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:49:17.416246 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:49:17.416254 | orchestrator | 2026-04-07 00:49:17.416262 | orchestrator | TASK [k3s_server_post : Deploy cilium] ***************************************** 2026-04-07 00:49:17.416270 | orchestrator | Tuesday 07 April 2026 00:48:01 +0000 (0:00:00.612) 0:02:58.374 ********* 2026-04-07 00:49:17.416278 | 
orchestrator | included: /ansible/roles/k3s_server_post/tasks/cilium.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:49:17.416286 | orchestrator | 2026-04-07 00:49:17.416294 | orchestrator | TASK [k3s_server_post : Create tmp directory on first master] ****************** 2026-04-07 00:49:17.416319 | orchestrator | Tuesday 07 April 2026 00:48:02 +0000 (0:00:00.572) 0:02:58.947 ********* 2026-04-07 00:49:17.416327 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-07 00:49:17.416335 | orchestrator | 2026-04-07 00:49:17.416343 | orchestrator | TASK [k3s_server_post : Wait for connectivity to kube VIP] ********************* 2026-04-07 00:49:17.416351 | orchestrator | Tuesday 07 April 2026 00:48:02 +0000 (0:00:00.860) 0:02:59.807 ********* 2026-04-07 00:49:17.416359 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-07 00:49:17.416367 | orchestrator | 2026-04-07 00:49:17.416375 | orchestrator | TASK [k3s_server_post : Fail if kube VIP not reachable] ************************ 2026-04-07 00:49:17.416383 | orchestrator | Tuesday 07 April 2026 00:48:03 +0000 (0:00:00.802) 0:03:00.610 ********* 2026-04-07 00:49:17.416391 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:49:17.416399 | orchestrator | 2026-04-07 00:49:17.416412 | orchestrator | TASK [k3s_server_post : Test for existing Cilium install] ********************** 2026-04-07 00:49:17.416420 | orchestrator | Tuesday 07 April 2026 00:48:03 +0000 (0:00:00.114) 0:03:00.724 ********* 2026-04-07 00:49:17.416428 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-07 00:49:17.416436 | orchestrator | 2026-04-07 00:49:17.416444 | orchestrator | TASK [k3s_server_post : Check Cilium version] ********************************** 2026-04-07 00:49:17.416452 | orchestrator | Tuesday 07 April 2026 00:48:04 +0000 (0:00:00.905) 0:03:01.630 ********* 2026-04-07 00:49:17.416460 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:49:17.416468 | orchestrator | 2026-04-07 
00:49:17.416476 | orchestrator | TASK [k3s_server_post : Parse installed Cilium version] ************************ 2026-04-07 00:49:17.416484 | orchestrator | Tuesday 07 April 2026 00:48:04 +0000 (0:00:00.102) 0:03:01.732 ********* 2026-04-07 00:49:17.416491 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:49:17.416499 | orchestrator | 2026-04-07 00:49:17.416507 | orchestrator | TASK [k3s_server_post : Determine if Cilium needs update] ********************** 2026-04-07 00:49:17.416515 | orchestrator | Tuesday 07 April 2026 00:48:05 +0000 (0:00:00.098) 0:03:01.831 ********* 2026-04-07 00:49:17.416523 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:49:17.416532 | orchestrator | 2026-04-07 00:49:17.416545 | orchestrator | TASK [k3s_server_post : Log result] ******************************************** 2026-04-07 00:49:17.416556 | orchestrator | Tuesday 07 April 2026 00:48:05 +0000 (0:00:00.245) 0:03:02.076 ********* 2026-04-07 00:49:17.416577 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:49:17.416590 | orchestrator | 2026-04-07 00:49:17.416604 | orchestrator | TASK [k3s_server_post : Install Cilium] **************************************** 2026-04-07 00:49:17.416617 | orchestrator | Tuesday 07 April 2026 00:48:05 +0000 (0:00:00.109) 0:03:02.186 ********* 2026-04-07 00:49:17.416632 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-07 00:49:17.416646 | orchestrator | 2026-04-07 00:49:17.416660 | orchestrator | TASK [k3s_server_post : Wait for Cilium resources] ***************************** 2026-04-07 00:49:17.416673 | orchestrator | Tuesday 07 April 2026 00:48:09 +0000 (0:00:04.302) 0:03:06.488 ********* 2026-04-07 00:49:17.416685 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=deployment/cilium-operator) 2026-04-07 00:49:17.416699 | orchestrator | FAILED - RETRYING: [testbed-node-0 -> localhost]: Wait for Cilium resources (30 retries left). 
2026-04-07 00:49:17.416713 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=daemonset/cilium) 2026-04-07 00:49:17.416727 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=deployment/hubble-relay) 2026-04-07 00:49:17.416741 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=deployment/hubble-ui) 2026-04-07 00:49:17.416754 | orchestrator | 2026-04-07 00:49:17.416768 | orchestrator | TASK [k3s_server_post : Set _cilium_bgp_neighbors fact] ************************ 2026-04-07 00:49:17.416783 | orchestrator | Tuesday 07 April 2026 00:48:52 +0000 (0:00:42.593) 0:03:49.081 ********* 2026-04-07 00:49:17.416805 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-07 00:49:17.416819 | orchestrator | 2026-04-07 00:49:17.416834 | orchestrator | TASK [k3s_server_post : Copy BGP manifests to first master] ******************** 2026-04-07 00:49:17.416848 | orchestrator | Tuesday 07 April 2026 00:48:53 +0000 (0:00:01.129) 0:03:50.211 ********* 2026-04-07 00:49:17.416861 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-07 00:49:17.416873 | orchestrator | 2026-04-07 00:49:17.416886 | orchestrator | TASK [k3s_server_post : Apply BGP manifests] *********************************** 2026-04-07 00:49:17.416906 | orchestrator | Tuesday 07 April 2026 00:48:54 +0000 (0:00:01.584) 0:03:51.795 ********* 2026-04-07 00:49:17.416921 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-07 00:49:17.416933 | orchestrator | 2026-04-07 00:49:17.416946 | orchestrator | TASK [k3s_server_post : Print error message if BGP manifests application fails] *** 2026-04-07 00:49:17.416957 | orchestrator | Tuesday 07 April 2026 00:48:56 +0000 (0:00:01.114) 0:03:52.910 ********* 2026-04-07 00:49:17.416969 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:49:17.416982 | orchestrator | 2026-04-07 00:49:17.416994 | orchestrator | TASK [k3s_server_post : Test for BGP config resources] ************************* 2026-04-07 00:49:17.417007 | orchestrator 
| Tuesday 07 April 2026 00:48:56 +0000 (0:00:00.118) 0:03:53.029 ********* 2026-04-07 00:49:17.417020 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=kubectl get CiliumBGPPeeringPolicy.cilium.io) 2026-04-07 00:49:17.417032 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=kubectl get CiliumLoadBalancerIPPool.cilium.io) 2026-04-07 00:49:17.417046 | orchestrator | 2026-04-07 00:49:17.417059 | orchestrator | TASK [k3s_server_post : Deploy metallb pool] *********************************** 2026-04-07 00:49:17.417072 | orchestrator | Tuesday 07 April 2026 00:48:57 +0000 (0:00:01.694) 0:03:54.723 ********* 2026-04-07 00:49:17.417086 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:49:17.417100 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:49:17.417111 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:49:17.417120 | orchestrator | 2026-04-07 00:49:17.417128 | orchestrator | TASK [k3s_server_post : Remove tmp directory used for manifests] *************** 2026-04-07 00:49:17.417136 | orchestrator | Tuesday 07 April 2026 00:48:58 +0000 (0:00:00.280) 0:03:55.004 ********* 2026-04-07 00:49:17.417144 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:49:17.417152 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:49:17.417160 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:49:17.417168 | orchestrator | 2026-04-07 00:49:17.417176 | orchestrator | PLAY [Apply role k9s] ********************************************************** 2026-04-07 00:49:17.417184 | orchestrator | 2026-04-07 00:49:17.417201 | orchestrator | TASK [k9s : Gather variables for each operating system] ************************ 2026-04-07 00:49:17.417211 | orchestrator | Tuesday 07 April 2026 00:48:59 +0000 (0:00:00.979) 0:03:55.984 ********* 2026-04-07 00:49:17.417224 | orchestrator | ok: [testbed-manager] 2026-04-07 00:49:17.417245 | orchestrator | 2026-04-07 00:49:17.417259 | orchestrator | TASK [k9s : Include distribution specific install tasks] 
*********************** 2026-04-07 00:49:17.417272 | orchestrator | Tuesday 07 April 2026 00:48:59 +0000 (0:00:00.124) 0:03:56.109 ********* 2026-04-07 00:49:17.417284 | orchestrator | included: /ansible/roles/k9s/tasks/install-Debian-family.yml for testbed-manager 2026-04-07 00:49:17.417320 | orchestrator | 2026-04-07 00:49:17.417350 | orchestrator | TASK [k9s : Install k9s packages] ********************************************** 2026-04-07 00:49:17.417363 | orchestrator | Tuesday 07 April 2026 00:48:59 +0000 (0:00:00.193) 0:03:56.303 ********* 2026-04-07 00:49:17.417375 | orchestrator | changed: [testbed-manager] 2026-04-07 00:49:17.417386 | orchestrator | 2026-04-07 00:49:17.417398 | orchestrator | PLAY [Manage labels, annotations, and taints on all k3s nodes] ***************** 2026-04-07 00:49:17.417410 | orchestrator | 2026-04-07 00:49:17.417423 | orchestrator | TASK [Merge labels, annotations, and taints] *********************************** 2026-04-07 00:49:17.417436 | orchestrator | Tuesday 07 April 2026 00:49:04 +0000 (0:00:05.050) 0:04:01.353 ********* 2026-04-07 00:49:17.417448 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:49:17.417462 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:49:17.417477 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:49:17.417491 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:49:17.417503 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:49:17.417516 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:49:17.417529 | orchestrator | 2026-04-07 00:49:17.417542 | orchestrator | TASK [Manage labels] *********************************************************** 2026-04-07 00:49:17.417554 | orchestrator | Tuesday 07 April 2026 00:49:05 +0000 (0:00:00.648) 0:04:02.001 ********* 2026-04-07 00:49:17.417567 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-04-07 00:49:17.417580 | orchestrator | ok: [testbed-node-3 -> localhost] => 
(item=node-role.osism.tech/compute-plane=true) 2026-04-07 00:49:17.417593 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-04-07 00:49:17.417607 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2026-04-07 00:49:17.417620 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-04-07 00:49:17.417633 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2026-04-07 00:49:17.417647 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=openstack-control-plane=enabled) 2026-04-07 00:49:17.417659 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=openstack-control-plane=enabled) 2026-04-07 00:49:17.417672 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-04-07 00:49:17.417685 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-04-07 00:49:17.417698 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=openstack-control-plane=enabled) 2026-04-07 00:49:17.417711 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-04-07 00:49:17.417739 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-04-07 00:49:17.417748 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-04-07 00:49:17.417756 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-04-07 00:49:17.417764 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-04-07 00:49:17.417772 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-04-07 00:49:17.417781 | orchestrator | ok: [testbed-node-4 -> 
localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-04-07 00:49:17.417804 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2026-04-07 00:49:17.417812 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2026-04-07 00:49:17.417820 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2026-04-07 00:49:17.417828 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-04-07 00:49:17.417836 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-04-07 00:49:17.417844 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-04-07 00:49:17.417852 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-04-07 00:49:17.417860 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-04-07 00:49:17.417868 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-04-07 00:49:17.417876 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-04-07 00:49:17.417884 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-04-07 00:49:17.417892 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-04-07 00:49:17.417900 | orchestrator | 2026-04-07 00:49:17.417908 | orchestrator | TASK [Manage annotations] ****************************************************** 2026-04-07 00:49:17.417916 | orchestrator | Tuesday 07 April 2026 00:49:13 +0000 (0:00:08.069) 0:04:10.071 ********* 2026-04-07 00:49:17.417924 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:49:17.417933 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:49:17.417941 | 
orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.417949 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.417957 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.417965 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.417973 | orchestrator |
2026-04-07 00:49:17.417982 | orchestrator | TASK [Manage taints] ***********************************************************
2026-04-07 00:49:17.417995 | orchestrator | Tuesday 07 April 2026 00:49:13 +0000 (0:00:00.631) 0:04:10.703 *********
2026-04-07 00:49:17.418008 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:49:17.418078 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:49:17.418092 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:49:17.418105 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:49:17.418118 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:49:17.418132 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:49:17.418145 | orchestrator |
2026-04-07 00:49:17.418159 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:49:17.418173 | orchestrator | testbed-manager : ok=21  changed=11  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:49:17.418189 | orchestrator | testbed-node-0 : ok=50  changed=23  unreachable=0 failed=0 skipped=28  rescued=0 ignored=0
2026-04-07 00:49:17.418202 | orchestrator | testbed-node-1 : ok=38  changed=16  unreachable=0 failed=0 skipped=25  rescued=0 ignored=0
2026-04-07 00:49:17.418216 | orchestrator | testbed-node-2 : ok=38  changed=16  unreachable=0 failed=0 skipped=25  rescued=0 ignored=0
2026-04-07 00:49:17.418230 | orchestrator | testbed-node-3 : ok=16  changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0
2026-04-07 00:49:17.418242 | orchestrator | testbed-node-4 : ok=16  changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0
2026-04-07 00:49:17.418255 | orchestrator | testbed-node-5 : ok=16  changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0
2026-04-07 00:49:17.418280 | orchestrator |
2026-04-07 00:49:17.418359 | orchestrator |
2026-04-07 00:49:17.418374 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:49:17.418382 | orchestrator | Tuesday 07 April 2026 00:49:14 +0000 (0:00:00.356) 0:04:11.059 *********
2026-04-07 00:49:17.418391 | orchestrator | ===============================================================================
2026-04-07 00:49:17.418399 | orchestrator | k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails) -- 43.94s
2026-04-07 00:49:17.418407 | orchestrator | k3s_server_post : Wait for Cilium resources ---------------------------- 42.59s
2026-04-07 00:49:17.418416 | orchestrator | k3s_server : Enable and check K3s service ------------------------------ 24.11s
2026-04-07 00:49:17.418434 | orchestrator | kubectl : Install required packages ------------------------------------ 12.32s
2026-04-07 00:49:17.418442 | orchestrator | k3s_agent : Manage k3s service ------------------------------------------ 9.89s
2026-04-07 00:49:17.418450 | orchestrator | Manage labels ----------------------------------------------------------- 8.07s
2026-04-07 00:49:17.418458 | orchestrator | kubectl : Add repository Debian ----------------------------------------- 7.61s
2026-04-07 00:49:17.418466 | orchestrator | k3s_download : Download k3s binary x64 ---------------------------------- 6.80s
2026-04-07 00:49:17.418474 | orchestrator | k9s : Install k9s packages ---------------------------------------------- 5.05s
2026-04-07 00:49:17.418482 | orchestrator | k3s_server_post : Install Cilium ---------------------------------------- 4.30s
2026-04-07 00:49:17.418490 | orchestrator | k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start --- 3.90s
2026-04-07 00:49:17.418498 | orchestrator | k3s_server : Detect Kubernetes version for label compatibility ---------- 2.91s
2026-04-07 00:49:17.418506 | orchestrator | k3s_prereq : Enable IPv6 forwarding ------------------------------------- 2.10s
2026-04-07 00:49:17.418514 | orchestrator | k3s_server : Set _kube_vip_bgp_peers fact ------------------------------- 1.97s
2026-04-07 00:49:17.418522 | orchestrator | k3s_prereq : Enable IPv6 router advertisements -------------------------- 1.92s
2026-04-07 00:49:17.418530 | orchestrator | k3s_server : Init cluster inside the transient k3s-init service --------- 1.87s
2026-04-07 00:49:17.418537 | orchestrator | k3s_prereq : Enable IPv4 forwarding ------------------------------------- 1.85s
2026-04-07 00:49:17.418545 | orchestrator | k3s_server : Copy vip manifest to first master -------------------------- 1.69s
2026-04-07 00:49:17.418553 | orchestrator | k3s_server_post : Test for BGP config resources ------------------------- 1.69s
2026-04-07 00:49:17.418561 | orchestrator | Make kubeconfig available for use inside the manager service ------------ 1.68s
2026-04-07 00:49:17.418570 | orchestrator | 2026-04-07 00:49:17 | INFO  | Task 6fe969df-57ee-4dfe-aa13-d16e7b5200b2 is in state STARTED
2026-04-07 00:49:17.418578 | orchestrator | 2026-04-07 00:49:17 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:49:17.418586 | orchestrator | 2026-04-07 00:49:17 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:49:17.418594 | orchestrator | 2026-04-07 00:49:17 | INFO  | Task 1e441808-6b6c-4e44-acd2-78bf3d8e00e3 is in state STARTED
2026-04-07 00:49:17.418602 | orchestrator | 2026-04-07 00:49:17 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:49:20.433363 | orchestrator | 2026-04-07 00:49:20 | INFO  | Task 6fe969df-57ee-4dfe-aa13-d16e7b5200b2 is in state SUCCESS
2026-04-07 00:49:20.434270 | orchestrator | 2026-04-07 00:49:20 | INFO  | Task
6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:49:20.434468 | orchestrator | 2026-04-07 00:49:20 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:49:20.435129 | orchestrator | 2026-04-07 00:49:20 | INFO  | Task 1e441808-6b6c-4e44-acd2-78bf3d8e00e3 is in state STARTED
2026-04-07 00:49:20.435184 | orchestrator | 2026-04-07 00:49:20 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:49:26.510744 | orchestrator | 2026-04-07 00:49:26 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state STARTED
2026-04-07 00:49:26.511238 | orchestrator | 2026-04-07 00:49:26 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:49:26.511738 | orchestrator | 2026-04-07 00:49:26 | INFO  | Task 1e441808-6b6c-4e44-acd2-78bf3d8e00e3 is in state SUCCESS
2026-04-07 00:49:26.511749 | orchestrator | 2026-04-07 00:49:26 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:51:37.307599 | orchestrator | 2026-04-07 00:51:37 | INFO  | Task 6a3ef287-97df-41b3-bf57-d2b4bc802644 is in state SUCCESS
2026-04-07 00:51:37.309097 | orchestrator |
2026-04-07 00:51:37.309172 | orchestrator |
2026-04-07 00:51:37.309182 | orchestrator | PLAY [Copy kubeconfig to the configuration repository] *************************
2026-04-07
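The long run of "Task … is in state STARTED" / "Wait 1 second(s) until the next check" lines is a simple status-polling loop over background task IDs, repeated every few seconds until each task leaves the STARTED state. A sketch of that pattern; the function name and the `get_state` callback are our assumptions, not the actual OSISM client API:

```python
import time

def wait_for_tasks(get_state, task_ids, interval=1.0):
    """Poll task states until no task is STARTED anymore, mirroring the
    'Task ... is in state STARTED' / 'Wait 1 second(s) until the next
    check' lines in the log. get_state(task_id) is assumed to query the
    task backend and return a state string such as STARTED or SUCCESS."""
    pending = set(task_ids)
    results = {}
    while pending:
        for task_id in sorted(pending):
            state = get_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state != "STARTED":
                # Terminal state reached; stop polling this task.
                results[task_id] = state
                pending.discard(task_id)
        if pending:
            print(f"Wait {int(interval)} second(s) until the next check")
            time.sleep(interval)
    return results
```

As in the log, tasks finish independently: 6fe969df and 1e441808 drop out early while 6a3ef287 and 256c75b3 keep being re-polled until they succeed.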
00:51:37.309191 | orchestrator |
2026-04-07 00:51:37.309198 | orchestrator | TASK [Get kubeconfig file] *****************************************************
2026-04-07 00:51:37.309206 | orchestrator | Tuesday 07 April 2026 00:49:16 +0000 (0:00:00.164) 0:00:00.164 *********
2026-04-07 00:51:37.309213 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)]
2026-04-07 00:51:37.309220 | orchestrator |
2026-04-07 00:51:37.309227 | orchestrator | TASK [Write kubeconfig file] ***************************************************
2026-04-07 00:51:37.309233 | orchestrator | Tuesday 07 April 2026 00:49:18 +0000 (0:00:01.155) 0:00:01.320 *********
2026-04-07 00:51:37.309239 | orchestrator | changed: [testbed-manager]
2026-04-07 00:51:37.309246 | orchestrator |
2026-04-07 00:51:37.309253 | orchestrator | TASK [Change server address in the kubeconfig file] ****************************
2026-04-07 00:51:37.309259 | orchestrator | Tuesday 07 April 2026 00:49:19 +0000 (0:00:01.426) 0:00:02.746 *********
2026-04-07 00:51:37.309265 | orchestrator | changed: [testbed-manager]
2026-04-07 00:51:37.309271 | orchestrator |
2026-04-07 00:51:37.309278 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:51:37.309321 | orchestrator | testbed-manager : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:51:37.309330 | orchestrator |
2026-04-07 00:51:37.309336 | orchestrator |
2026-04-07 00:51:37.309360 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:51:37.309367 | orchestrator | Tuesday 07 April 2026 00:49:19 +0000 (0:00:00.477) 0:00:03.224 *********
2026-04-07 00:51:37.309374 | orchestrator | ===============================================================================
2026-04-07 00:51:37.309380 | orchestrator | Write kubeconfig file --------------------------------------------------- 1.43s
2026-04-07 00:51:37.309387 | orchestrator | Get kubeconfig file ----------------------------------------------------- 1.16s
2026-04-07 00:51:37.309393 | orchestrator | Change server address in the kubeconfig file ---------------------------- 0.48s
2026-04-07 00:51:37.309399 | orchestrator |
2026-04-07 00:51:37.309405 | orchestrator |
2026-04-07 00:51:37.309411 | orchestrator | PLAY [Prepare kubeconfig file] *************************************************
2026-04-07 00:51:37.309417 | orchestrator |
2026-04-07 00:51:37.309496 | orchestrator | TASK [Get home directory of operator user] *************************************
2026-04-07 00:51:37.309503 | orchestrator | Tuesday 07 April 2026 00:49:17 +0000 (0:00:00.203) 0:00:00.203 *********
2026-04-07 00:51:37.309510 | orchestrator | ok: [testbed-manager]
2026-04-07 00:51:37.309518 | orchestrator |
2026-04-07 00:51:37.309524 | orchestrator | TASK [Create .kube directory] **************************************************
2026-04-07 00:51:37.309530 | orchestrator | Tuesday 07 April 2026 00:49:18 +0000 (0:00:00.707) 0:00:00.910 *********
2026-04-07 00:51:37.309536 | orchestrator | ok: [testbed-manager]
2026-04-07 00:51:37.309543 | orchestrator |
2026-04-07 00:51:37.309549 | orchestrator | TASK [Get kubeconfig file] *****************************************************
2026-04-07 00:51:37.309555 | orchestrator | Tuesday 07 April 2026 00:49:18 +0000 (0:00:00.543) 0:00:01.454 *********
2026-04-07 00:51:37.309561 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)]
2026-04-07 00:51:37.309568 | orchestrator |
2026-04-07 00:51:37.309574 | orchestrator | TASK [Write kubeconfig file] ***************************************************
2026-04-07 00:51:37.309580 | orchestrator | Tuesday 07 April 2026 00:49:19 +0000 (0:00:01.022) 0:00:02.476 *********
2026-04-07 00:51:37.309586 | orchestrator | changed: [testbed-manager]
2026-04-07 00:51:37.309592 | orchestrator |
2026-04-07 00:51:37.309599 | orchestrator | TASK [Change server address in the kubeconfig] *********************************
2026-04-07 00:51:37.309605 | orchestrator | Tuesday 07 April 2026 00:49:20 +0000 (0:00:00.987) 0:00:03.463 *********
2026-04-07 00:51:37.309611 | orchestrator | changed: [testbed-manager]
2026-04-07 00:51:37.309618 | orchestrator |
2026-04-07 00:51:37.309624 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************
2026-04-07 00:51:37.309630 | orchestrator | Tuesday 07 April 2026 00:49:21 +0000 (0:00:00.419) 0:00:03.883 *********
2026-04-07 00:51:37.309637 | orchestrator | changed: [testbed-manager -> localhost]
2026-04-07 00:51:37.309643 | orchestrator |
2026-04-07 00:51:37.309649 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ******
2026-04-07 00:51:37.309655 | orchestrator | Tuesday 07 April 2026 00:49:22 +0000 (0:00:01.627) 0:00:05.510 *********
2026-04-07 00:51:37.309661 | orchestrator | changed: [testbed-manager -> localhost]
2026-04-07 00:51:37.309668 | orchestrator |
2026-04-07 00:51:37.309674 | orchestrator | TASK [Set KUBECONFIG environment variable] *************************************
2026-04-07 00:51:37.309680 | orchestrator | Tuesday 07 April 2026 00:49:23 +0000 (0:00:00.817) 0:00:06.328 *********
2026-04-07 00:51:37.309687 | orchestrator | ok: [testbed-manager]
2026-04-07 00:51:37.309693 | orchestrator |
2026-04-07 00:51:37.309699 | orchestrator | TASK [Enable kubectl command line completion] **********************************
2026-04-07 00:51:37.309705 | orchestrator | Tuesday 07 April 2026 00:49:23 +0000 (0:00:00.345) 0:00:06.674 *********
2026-04-07 00:51:37.309711 | orchestrator | ok: [testbed-manager]
2026-04-07 00:51:37.309717 | orchestrator |
2026-04-07 00:51:37.309723 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:51:37.309755 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:51:37.309762 | orchestrator |
2026-04-07 00:51:37.309768 | orchestrator |
2026-04-07 00:51:37.309773 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:51:37.309779 | orchestrator | Tuesday 07 April 2026 00:49:24 +0000 (0:00:00.243) 0:00:06.918 *********
2026-04-07 00:51:37.309785 | orchestrator | ===============================================================================
2026-04-07 00:51:37.309791 | orchestrator | Make kubeconfig available for use inside the manager service ------------ 1.63s
2026-04-07 00:51:37.309797 | orchestrator | Get kubeconfig file ----------------------------------------------------- 1.02s
2026-04-07 00:51:37.309803 | orchestrator | Write kubeconfig file --------------------------------------------------- 0.99s
2026-04-07 00:51:37.309824 | orchestrator | Change server address in the kubeconfig inside the manager service ------ 0.82s
2026-04-07 00:51:37.309831 | orchestrator | Get home directory of operator user ------------------------------------- 0.71s
2026-04-07 00:51:37.309838 | orchestrator | Create .kube directory -------------------------------------------------- 0.54s
2026-04-07 00:51:37.309844 | orchestrator | Change server address in the kubeconfig --------------------------------- 0.42s
2026-04-07 00:51:37.309850 | orchestrator | Set KUBECONFIG environment variable ------------------------------------- 0.35s
2026-04-07 00:51:37.309856 | orchestrator | Enable kubectl command line completion ---------------------------------- 0.24s
2026-04-07 00:51:37.309863 | orchestrator |
2026-04-07 00:51:37.309869 | orchestrator |
2026-04-07 00:51:37.309875 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:51:37.309881 | orchestrator |
2026-04-07 00:51:37.309888 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 00:51:37.309894 | orchestrator | Tuesday 07
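Both kubeconfig plays above fetch the admin kubeconfig from testbed-node-0 (192.168.16.10) and then rewrite its `server:` entry so clients reach the API server at a routable address instead of the node-local default. A sketch of that rewrite step, assuming a plain regex approach (the plays themselves do this with Ansible tasks):

```python
import re

def set_kubeconfig_server(kubeconfig_text: str, new_server: str) -> str:
    """Point the 'server:' entry of a kubeconfig at a new address, the
    way the 'Change server address in the kubeconfig' tasks do.
    Assumes one cluster entry, as in a freshly generated k3s kubeconfig."""
    return re.sub(r"(?m)^(\s*server:\s*).*$", r"\g<1>" + new_server, kubeconfig_text)

# Shape of a default k3s kubeconfig, abbreviated for illustration.
kubeconfig = """apiVersion: v1
clusters:
- cluster:
    server: https://127.0.0.1:6443
  name: default
"""
updated = set_kubeconfig_server(kubeconfig, "https://192.168.16.10:6443")
```

The log runs this twice on purpose: once for the copy pushed into the configuration repository and once for the copy made available inside the manager service, which may need a different reachable address.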
April 2026 00:46:15 +0000 (0:00:00.717) 0:00:00.719 *********
2026-04-07 00:51:37.309900 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:51:37.309907 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:51:37.309913 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:51:37.309919 | orchestrator |
2026-04-07 00:51:37.309926 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 00:51:37.309938 | orchestrator | Tuesday 07 April 2026 00:46:16 +0000 (0:00:00.929) 0:00:01.649 *********
2026-04-07 00:51:37.309944 | orchestrator | ok: [testbed-node-0] => (item=enable_loadbalancer_True)
2026-04-07 00:51:37.309951 | orchestrator | ok: [testbed-node-1] => (item=enable_loadbalancer_True)
2026-04-07 00:51:37.309957 | orchestrator | ok: [testbed-node-2] => (item=enable_loadbalancer_True)
2026-04-07 00:51:37.309963 | orchestrator |
2026-04-07 00:51:37.309969 | orchestrator | PLAY [Apply role loadbalancer] *************************************************
2026-04-07 00:51:37.309976 | orchestrator |
2026-04-07 00:51:37.309982 | orchestrator | TASK [loadbalancer : include_tasks] ********************************************
2026-04-07 00:51:37.309988 | orchestrator | Tuesday 07 April 2026 00:46:17 +0000 (0:00:01.160) 0:00:02.809 *********
2026-04-07 00:51:37.309995 | orchestrator | included: /ansible/roles/loadbalancer/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.310002 | orchestrator |
2026-04-07 00:51:37.310008 | orchestrator | TASK [loadbalancer : Check IPv6 support] ***************************************
2026-04-07 00:51:37.310059 | orchestrator | Tuesday 07 April 2026 00:46:19 +0000 (0:00:01.171) 0:00:03.980 *********
2026-04-07 00:51:37.310067 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:51:37.310073 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:51:37.310080 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:51:37.310087 | orchestrator |
2026-04-07 00:51:37.310094
| orchestrator | TASK [Setting sysctl values] *************************************************** 2026-04-07 00:51:37.310100 | orchestrator | Tuesday 07 April 2026 00:46:20 +0000 (0:00:01.824) 0:00:05.805 ********* 2026-04-07 00:51:37.310107 | orchestrator | included: sysctl for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.310114 | orchestrator | 2026-04-07 00:51:37.310120 | orchestrator | TASK [sysctl : Check IPv6 support] ********************************************* 2026-04-07 00:51:37.310133 | orchestrator | Tuesday 07 April 2026 00:46:22 +0000 (0:00:01.364) 0:00:07.169 ********* 2026-04-07 00:51:37.310139 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.310145 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.310152 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.310159 | orchestrator | 2026-04-07 00:51:37.310165 | orchestrator | TASK [sysctl : Setting sysctl values] ****************************************** 2026-04-07 00:51:37.310171 | orchestrator | Tuesday 07 April 2026 00:46:24 +0000 (0:00:02.083) 0:00:09.253 ********* 2026-04-07 00:51:37.310178 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2026-04-07 00:51:37.310185 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2026-04-07 00:51:37.310191 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2026-04-07 00:51:37.310198 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2026-04-07 00:51:37.310205 | orchestrator | ok: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2026-04-07 00:51:37.310212 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2026-04-07 00:51:37.310219 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.unix.max_dgram_qlen', 
'value': 128}) 2026-04-07 00:51:37.310225 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2026-04-07 00:51:37.310232 | orchestrator | ok: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2026-04-07 00:51:37.310238 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2026-04-07 00:51:37.310245 | orchestrator | ok: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2026-04-07 00:51:37.310252 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2026-04-07 00:51:37.310258 | orchestrator | 2026-04-07 00:51:37.310264 | orchestrator | TASK [module-load : Load modules] ********************************************** 2026-04-07 00:51:37.310271 | orchestrator | Tuesday 07 April 2026 00:46:30 +0000 (0:00:05.906) 0:00:15.159 ********* 2026-04-07 00:51:37.310278 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2026-04-07 00:51:37.310285 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2026-04-07 00:51:37.310291 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2026-04-07 00:51:37.310298 | orchestrator | 2026-04-07 00:51:37.310305 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2026-04-07 00:51:37.310317 | orchestrator | Tuesday 07 April 2026 00:46:31 +0000 (0:00:01.065) 0:00:16.225 ********* 2026-04-07 00:51:37.310324 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2026-04-07 00:51:37.310330 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2026-04-07 00:51:37.310337 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2026-04-07 00:51:37.310390 | orchestrator | 2026-04-07 00:51:37.310398 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2026-04-07 00:51:37.310404 | orchestrator | Tuesday 07 April 2026 
00:46:33 +0000 (0:00:02.328) 0:00:18.553 ********* 2026-04-07 00:51:37.310410 | orchestrator | skipping: [testbed-node-0] => (item=ip_vs)  2026-04-07 00:51:37.310416 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.310423 | orchestrator | skipping: [testbed-node-1] => (item=ip_vs)  2026-04-07 00:51:37.310429 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.310435 | orchestrator | skipping: [testbed-node-2] => (item=ip_vs)  2026-04-07 00:51:37.310441 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.310447 | orchestrator | 2026-04-07 00:51:37.310454 | orchestrator | TASK [loadbalancer : Ensuring config directories exist] ************************ 2026-04-07 00:51:37.310460 | orchestrator | Tuesday 07 April 2026 00:46:34 +0000 (0:00:00.696) 0:00:19.249 ********* 2026-04-07 00:51:37.310475 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.310488 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.310495 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.310502 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.310509 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.310561 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.310578 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.310586 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.310592 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.310599 | orchestrator | 2026-04-07 00:51:37.310605 | orchestrator | TASK [loadbalancer : Ensuring haproxy service config subdir exists] ************ 2026-04-07 00:51:37.310612 | orchestrator | Tuesday 07 April 2026 00:46:36 +0000 (0:00:02.330) 0:00:21.580 ********* 2026-04-07 00:51:37.310618 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.310625 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.310631 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.310637 | orchestrator | 2026-04-07 00:51:37.310644 | orchestrator | TASK [loadbalancer : Ensuring proxysql service config subdirectories exist] **** 2026-04-07 00:51:37.310650 | orchestrator | Tuesday 07 April 2026 00:46:38 +0000 (0:00:01.344) 0:00:22.924 ********* 2026-04-07 00:51:37.310656 | orchestrator | changed: [testbed-node-0] => (item=users) 2026-04-07 00:51:37.310662 | orchestrator | changed: [testbed-node-1] => (item=users) 2026-04-07 00:51:37.310668 | orchestrator | changed: [testbed-node-2] => (item=users) 2026-04-07 00:51:37.310675 | orchestrator | changed: [testbed-node-0] => (item=rules) 2026-04-07 00:51:37.310696 | orchestrator | changed: [testbed-node-1] => (item=rules) 2026-04-07 00:51:37.310702 | orchestrator | changed: [testbed-node-2] => (item=rules) 2026-04-07 00:51:37.310709 | 
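
The "Setting sysctl values" task above reports `ok` (unchanged) for `net.ipv4.tcp_retries2` while the other items report `changed`; the value `KOLLA_UNSET` acts as a sentinel meaning "leave this key alone". A minimal sketch of that filtering, assuming item dicts shaped exactly as in the log output (the function name `effective_sysctl` is hypothetical, not part of the role):

```python
# Hypothetical sketch: reproduce the sentinel filtering visible in the
# "sysctl : Setting sysctl values" task, where items whose value is the
# sentinel string 'KOLLA_UNSET' are skipped and reported "ok".
KOLLA_UNSET = "KOLLA_UNSET"

def effective_sysctl(items):
    """Return only the settings that would actually be written."""
    return {i["name"]: i["value"] for i in items if i["value"] != KOLLA_UNSET}

# Item list as it appears in the log for one node.
items = [
    {"name": "net.ipv6.ip_nonlocal_bind", "value": 1},
    {"name": "net.ipv4.ip_nonlocal_bind", "value": 1},
    {"name": "net.ipv4.tcp_retries2", "value": "KOLLA_UNSET"},
    {"name": "net.unix.max_dgram_qlen", "value": 128},
]

print(effective_sysctl(items))
```

This matches the per-node results above: three `changed` items and one `ok` item per node.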
orchestrator | 2026-04-07 00:51:37.310723 | orchestrator | TASK [loadbalancer : Ensuring keepalived checks subdir exists] ***************** 2026-04-07 00:51:37.310729 | orchestrator | Tuesday 07 April 2026 00:46:39 +0000 (0:00:01.826) 0:00:24.750 ********* 2026-04-07 00:51:37.310736 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.310742 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.310748 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.310754 | orchestrator | 2026-04-07 00:51:37.310761 | orchestrator | TASK [loadbalancer : Remove mariadb.cfg if proxysql enabled] ******************* 2026-04-07 00:51:37.310767 | orchestrator | Tuesday 07 April 2026 00:46:40 +0000 (0:00:00.893) 0:00:25.644 ********* 2026-04-07 00:51:37.310774 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.310780 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.310842 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.310849 | orchestrator | 2026-04-07 00:51:37.310855 | orchestrator | TASK [loadbalancer : Removing checks for services which are disabled] ********** 2026-04-07 00:51:37.310862 | orchestrator | Tuesday 07 April 2026 00:46:42 +0000 (0:00:01.245) 0:00:26.889 ********* 2026-04-07 00:51:37.310874 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.310889 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.310900 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.310908 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-04-07 00:51:37.310915 | orchestrator | skipping: [testbed-node-0] 2026-04-07 
00:51:37.310922 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.310929 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.310935 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.310950 | orchestrator | skipping: [testbed-node-1] => 
(item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-04-07 00:51:37.310956 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.310966 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.310973 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.310980 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.310987 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-04-07 00:51:37.310994 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.311000 | orchestrator | 2026-04-07 00:51:37.311007 | orchestrator | TASK [loadbalancer : Copying checks for services which are enabled] ************ 2026-04-07 00:51:37.311137 | orchestrator | Tuesday 07 April 2026 00:46:43 +0000 (0:00:01.686) 0:00:28.576 ********* 2026-04-07 00:51:37.311147 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311160 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311170 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311177 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 
'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311184 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.311191 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311198 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 
'registry.osism.tech/kolla/release//haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-04-07 00:51:37.311215 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.311222 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-04-07 00:51:37.311232 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 
'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311239 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.311246 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5', '__omit_place_holder__b737f7ad23ac7e3d815b2d61c494fe16718611c5'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-04-07 00:51:37.311252 | orchestrator | 2026-04-07 00:51:37.311259 | orchestrator | TASK [loadbalancer : Copying over config.json files for services] 
************** 2026-04-07 00:51:37.311270 | orchestrator | Tuesday 07 April 2026 00:46:47 +0000 (0:00:03.578) 0:00:32.154 ********* 2026-04-07 00:51:37.311277 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311528 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311543 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311555 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311597 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311604 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.311619 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.311627 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.311637 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 
00:51:37.311645 | orchestrator | 2026-04-07 00:51:37.311671 | orchestrator | TASK [loadbalancer : Copying over haproxy.cfg] ********************************* 2026-04-07 00:51:37.311678 | orchestrator | Tuesday 07 April 2026 00:46:50 +0000 (0:00:03.298) 0:00:35.453 ********* 2026-04-07 00:51:37.311686 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-04-07 00:51:37.311693 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-04-07 00:51:37.311699 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-04-07 00:51:37.311706 | orchestrator | 2026-04-07 00:51:37.311712 | orchestrator | TASK [loadbalancer : Copying over proxysql config] ***************************** 2026-04-07 00:51:37.311719 | orchestrator | Tuesday 07 April 2026 00:46:52 +0000 (0:00:01.970) 0:00:37.424 ********* 2026-04-07 00:51:37.311729 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-04-07 00:51:37.311737 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-04-07 00:51:37.311744 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-04-07 00:51:37.311750 | orchestrator | 2026-04-07 00:51:37.311778 | orchestrator | TASK [loadbalancer : Copying over haproxy single external frontend config] ***** 2026-04-07 00:51:37.311785 | orchestrator | Tuesday 07 April 2026 00:46:55 +0000 (0:00:03.379) 0:00:40.804 ********* 2026-04-07 00:51:37.311791 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.311798 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.311804 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.311810 | orchestrator | 2026-04-07 
00:51:37.311817 | orchestrator | TASK [loadbalancer : Copying over custom haproxy services configuration] ******* 2026-04-07 00:51:37.311823 | orchestrator | Tuesday 07 April 2026 00:46:56 +0000 (0:00:00.743) 0:00:41.547 ********* 2026-04-07 00:51:37.311830 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-04-07 00:51:37.311837 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-04-07 00:51:37.311867 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-04-07 00:51:37.311875 | orchestrator | 2026-04-07 00:51:37.311881 | orchestrator | TASK [loadbalancer : Copying over keepalived.conf] ***************************** 2026-04-07 00:51:37.311888 | orchestrator | Tuesday 07 April 2026 00:46:59 +0000 (0:00:02.327) 0:00:43.875 ********* 2026-04-07 00:51:37.311895 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-04-07 00:51:37.311902 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-04-07 00:51:37.311908 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-04-07 00:51:37.311915 | orchestrator | 2026-04-07 00:51:37.311921 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2026-04-07 00:51:37.311928 | orchestrator | Tuesday 07 April 2026 00:47:00 +0000 (0:00:01.960) 0:00:45.835 ********* 2026-04-07 00:51:37.311934 | orchestrator | included: /ansible/roles/loadbalancer/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.311940 | orchestrator | 2026-04-07 00:51:37.311947 | orchestrator | TASK [loadbalancer 
: Copying over haproxy.pem] ********************************* 2026-04-07 00:51:37.311953 | orchestrator | Tuesday 07 April 2026 00:47:01 +0000 (0:00:00.705) 0:00:46.541 ********* 2026-04-07 00:51:37.311982 | orchestrator | changed: [testbed-node-0] => (item=haproxy.pem) 2026-04-07 00:51:37.311989 | orchestrator | changed: [testbed-node-1] => (item=haproxy.pem) 2026-04-07 00:51:37.311995 | orchestrator | changed: [testbed-node-2] => (item=haproxy.pem) 2026-04-07 00:51:37.312002 | orchestrator | 2026-04-07 00:51:37.312009 | orchestrator | TASK [loadbalancer : Copying over haproxy-internal.pem] ************************ 2026-04-07 00:51:37.312015 | orchestrator | Tuesday 07 April 2026 00:47:03 +0000 (0:00:02.136) 0:00:48.677 ********* 2026-04-07 00:51:37.312022 | orchestrator | changed: [testbed-node-0] => (item=haproxy-internal.pem) 2026-04-07 00:51:37.312029 | orchestrator | changed: [testbed-node-1] => (item=haproxy-internal.pem) 2026-04-07 00:51:37.312035 | orchestrator | changed: [testbed-node-2] => (item=haproxy-internal.pem) 2026-04-07 00:51:37.312042 | orchestrator | 2026-04-07 00:51:37.312048 | orchestrator | TASK [loadbalancer : Copying over proxysql-cert.pem] *************************** 2026-04-07 00:51:37.312055 | orchestrator | Tuesday 07 April 2026 00:47:05 +0000 (0:00:01.636) 0:00:50.314 ********* 2026-04-07 00:51:37.312062 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.312068 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.312075 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.312082 | orchestrator | 2026-04-07 00:51:37.312089 | orchestrator | TASK [loadbalancer : Copying over proxysql-key.pem] **************************** 2026-04-07 00:51:37.312095 | orchestrator | Tuesday 07 April 2026 00:47:05 +0000 (0:00:00.354) 0:00:50.669 ********* 2026-04-07 00:51:37.312101 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.312108 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.312141 | 
orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.312149 | orchestrator | 2026-04-07 00:51:37.312161 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2026-04-07 00:51:37.312169 | orchestrator | Tuesday 07 April 2026 00:47:06 +0000 (0:00:00.308) 0:00:50.977 ********* 2026-04-07 00:51:37.312177 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312194 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312202 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': 
['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312209 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312217 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312225 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312238 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.312250 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.312264 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.312272 | orchestrator | 2026-04-07 00:51:37.312279 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS certificate] *** 2026-04-07 00:51:37.312286 | orchestrator | Tuesday 07 April 2026 00:47:09 +0000 (0:00:03.658) 0:00:54.636 ********* 2026-04-07 00:51:37.312293 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.312300 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.312307 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 
'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.312314 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.312325 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.312332 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.312365 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 
'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.312372 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.312379 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.312386 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.312392 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 
'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.312471 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.312481 | orchestrator | 2026-04-07 00:51:37.312487 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2026-04-07 00:51:37.312494 | orchestrator | Tuesday 07 April 2026 00:47:10 +0000 (0:00:00.536) 0:00:55.172 ********* 2026-04-07 00:51:37.312501 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.312512 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.312532 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.312541 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.312579 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.312586 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 
'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.312593 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.312599 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.312606 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.312619 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.312631 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.312638 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.312644 | orchestrator | 2026-04-07 00:51:37.312651 | orchestrator | TASK [loadbalancer : Copying over haproxy start script] ************************ 2026-04-07 00:51:37.312661 | orchestrator | Tuesday 07 April 2026 00:47:11 +0000 (0:00:00.876) 0:00:56.049 ********* 2026-04-07 00:51:37.312668 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-04-07 00:51:37.312674 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-04-07 00:51:37.312681 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-04-07 00:51:37.312687 | orchestrator | 2026-04-07 00:51:37.312694 | orchestrator | TASK [loadbalancer : Copying over proxysql start script] *********************** 2026-04-07 00:51:37.312700 | orchestrator | Tuesday 07 April 2026 00:47:12 +0000 (0:00:01.805) 0:00:57.855 ********* 2026-04-07 00:51:37.312707 | orchestrator | changed: [testbed-node-0] => 
(item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-04-07 00:51:37.312714 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-04-07 00:51:37.312721 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-04-07 00:51:37.312727 | orchestrator | 2026-04-07 00:51:37.312734 | orchestrator | TASK [loadbalancer : Copying files for haproxy-ssh] **************************** 2026-04-07 00:51:37.312740 | orchestrator | Tuesday 07 April 2026 00:47:14 +0000 (0:00:01.783) 0:00:59.639 ********* 2026-04-07 00:51:37.312747 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-04-07 00:51:37.312753 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-04-07 00:51:37.312760 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-04-07 00:51:37.312767 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.312774 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-04-07 00:51:37.312780 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-04-07 00:51:37.312787 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.312794 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-04-07 00:51:37.312801 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.312808 | orchestrator | 2026-04-07 00:51:37.312815 | orchestrator | TASK [service-check-containers : loadbalancer | Check containers] ************** 2026-04-07 00:51:37.312822 | orchestrator | Tuesday 07 April 2026 00:47:15 +0000 (0:00:00.922) 0:01:00.561 ********* 
2026-04-07 00:51:37.312829 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312844 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312851 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312862 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312870 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312876 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-07 00:51:37.312883 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.312895 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.313534 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-07 00:51:37.313566 | orchestrator | 2026-04-07 00:51:37.313573 | orchestrator | TASK [service-check-containers : loadbalancer | Notify handlers to restart containers] *** 2026-04-07 00:51:37.313581 | orchestrator | 
Tuesday 07 April 2026 00:47:18 +0000 (0:00:02.704) 0:01:03.265 ********* 2026-04-07 00:51:37.313588 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:51:37.313596 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:51:37.313603 | orchestrator | } 2026-04-07 00:51:37.313609 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 00:51:37.313615 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:51:37.313622 | orchestrator | } 2026-04-07 00:51:37.313628 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:51:37.313634 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:51:37.313641 | orchestrator | } 2026-04-07 00:51:37.313647 | orchestrator | 2026-04-07 00:51:37.313654 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:51:37.313660 | orchestrator | Tuesday 07 April 2026 00:47:18 +0000 (0:00:00.317) 0:01:03.583 ********* 2026-04-07 00:51:37.313673 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.313682 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.313689 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.313728 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.313735 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.313742 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': 
['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.313756 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.313763 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.313774 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-04-07 00:51:37.313780 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 
'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-07 00:51:37.313787 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-07 00:51:37.313877 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.313884 | orchestrator | 2026-04-07 00:51:37.313891 | orchestrator | TASK [include_role : aodh] ***************************************************** 2026-04-07 00:51:37.313897 | orchestrator | Tuesday 07 April 2026 00:47:20 +0000 (0:00:01.406) 0:01:04.989 ********* 2026-04-07 00:51:37.313904 | orchestrator | included: aodh for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.313911 | orchestrator | 2026-04-07 00:51:37.313918 | orchestrator | TASK [haproxy-config : Copying over aodh haproxy config] *********************** 2026-04-07 00:51:37.313924 | orchestrator | Tuesday 07 April 2026 00:47:20 +0000 (0:00:00.730) 0:01:05.720 ********* 2026-04-07 00:51:37.313932 | orchestrator | changed: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-api:20.0.0.20260328', 'volumes': 
['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.313947 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-07 00:51:37.313956 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port 
aodh-listener 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.313967 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.313974 | orchestrator | changed: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.314106 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-evaluator:20.0.0.20260328', 'volumes': 
['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-07 00:51:37.314118 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314130 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314140 | orchestrator | changed: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.314147 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-07 00:51:37.314160 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314167 | orchestrator | skipping: 
[testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314174 | orchestrator | 2026-04-07 00:51:37.314181 | orchestrator | TASK [haproxy-config : Add configuration for aodh when using single external frontend] *** 2026-04-07 00:51:37.314188 | orchestrator | Tuesday 07 April 2026 00:47:23 +0000 (0:00:03.066) 0:01:08.787 ********* 2026-04-07 00:51:37.314195 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.314208 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 
'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-07 00:51:37.314222 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314234 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314241 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.314248 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-api', 
'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.314256 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-07 00:51:37.314263 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314272 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314302 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.314338 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.314387 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-07 00:51:37.314395 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314428 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.314434 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.314441 | orchestrator | 2026-04-07 00:51:37.314447 | 
orchestrator | TASK [haproxy-config : Configuring firewall for aodh] ************************** 2026-04-07 00:51:37.314454 | orchestrator | Tuesday 07 April 2026 00:47:24 +0000 (0:00:00.620) 0:01:09.407 ********* 2026-04-07 00:51:37.314463 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.314474 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.314483 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.314494 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.314502 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.314508 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.314516 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.314532 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-07 
00:51:37.314539 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.314545 | orchestrator |
2026-04-07 00:51:37.314552 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL users config] ***************
2026-04-07 00:51:37.314559 | orchestrator | Tuesday 07 April 2026 00:47:25 +0000 (0:00:01.003) 0:01:10.410 *********
2026-04-07 00:51:37.314566 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.314572 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.314578 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.314585 | orchestrator |
2026-04-07 00:51:37.314591 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL rules config] ***************
2026-04-07 00:51:37.314598 | orchestrator | Tuesday 07 April 2026 00:47:26 +0000 (0:00:01.187) 0:01:11.598 *********
2026-04-07 00:51:37.314604 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.314611 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.314617 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.314624 | orchestrator |
2026-04-07 00:51:37.314631 | orchestrator | TASK [include_role : barbican] *************************************************
2026-04-07 00:51:37.314638 | orchestrator | Tuesday 07 April 2026 00:47:28 +0000 (0:00:01.993) 0:01:13.592 *********
2026-04-07 00:51:37.314644 | orchestrator | included: barbican for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.314651 | orchestrator |
2026-04-07 00:51:37.314657 | orchestrator | TASK [haproxy-config : Copying over barbican haproxy config] *******************
2026-04-07 00:51:37.314664 | orchestrator | Tuesday 07 April 2026 00:47:29 +0000 (0:00:00.592) 0:01:14.185 *********
2026-04-07 00:51:37.314671 | orchestrator | changed: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.314679 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314690 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314706 | orchestrator | changed: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.314734 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314741 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314748 | orchestrator | changed: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.314760 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314771 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314778 | orchestrator |
2026-04-07 00:51:37.314788 | orchestrator | TASK [haproxy-config : Add configuration for barbican when using single external frontend] ***
2026-04-07 00:51:37.314795 | orchestrator | Tuesday 07 April 2026 00:47:33 +0000 (0:00:04.199) 0:01:18.384 *********
2026-04-07 00:51:37.314802 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.314809 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314815 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314822 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.314840 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.314941 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314948 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314955 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.314962 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.314969 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314980 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.314991 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.314997 | orchestrator |
2026-04-07 00:51:37.315004 | orchestrator | TASK [haproxy-config : Configuring firewall for barbican] **********************
2026-04-07 00:51:37.315011 | orchestrator | Tuesday 07 April 2026 00:47:34 +0000 (0:00:00.885) 0:01:19.270 *********
2026-04-07 00:51:37.315018 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.315025 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.315033 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.315043 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.315050 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.315056 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.315063 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.315070 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.315077 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.315083 | orchestrator |
2026-04-07 00:51:37.315090 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL users config] ***********
2026-04-07 00:51:37.315097 | orchestrator | Tuesday 07 April 2026 00:47:35 +0000 (0:00:00.785) 0:01:20.056 *********
2026-04-07 00:51:37.315103 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.315110 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.315117 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.315123 | orchestrator |
2026-04-07 00:51:37.315130 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL rules config] ***********
2026-04-07 00:51:37.315136 | orchestrator | Tuesday 07 April 2026 00:47:36 +0000 (0:00:01.099) 0:01:21.155 *********
2026-04-07 00:51:37.315142 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.315148 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.315154 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.315160 | orchestrator |
2026-04-07 00:51:37.315167 | orchestrator | TASK [include_role : blazar] ***************************************************
2026-04-07 00:51:37.315173 | orchestrator | Tuesday 07 April 2026 00:47:38 +0000 (0:00:01.910) 0:01:23.065 *********
2026-04-07 00:51:37.315185 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.315191 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.315198 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.315204 | orchestrator |
2026-04-07 00:51:37.315211 | orchestrator | TASK [include_role : ceph-rgw] *************************************************
2026-04-07 00:51:37.315217 | orchestrator | Tuesday 07 April 2026 00:47:38 +0000 (0:00:00.828) 0:01:23.344 *********
2026-04-07 00:51:37.315224 | orchestrator | included: ceph-rgw for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.315230 | orchestrator |
2026-04-07 00:51:37.315237 | orchestrator | TASK [haproxy-config : Copying over ceph-rgw haproxy config] *******************
2026-04-07 00:51:37.315244 | orchestrator | Tuesday 07 April 2026 00:47:39 +0000 (0:00:00.828) 0:01:24.173 *********
2026-04-07 00:51:37.315253 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})
2026-04-07 00:51:37.315264 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})
2026-04-07 00:51:37.315279 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})
2026-04-07 00:51:37.315285 | orchestrator |
2026-04-07 00:51:37.315292 | orchestrator | TASK [haproxy-config : Add configuration for ceph-rgw when using single external frontend] ***
2026-04-07 00:51:37.315299 | orchestrator | Tuesday 07 April 2026 00:47:42 +0000 (0:00:02.945) 0:01:27.118 *********
2026-04-07 00:51:37.315306 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})
2026-04-07 00:51:37.315317 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.315324 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})
2026-04-07 00:51:37.315331 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.315362 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})
2026-04-07 00:51:37.315369 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.315375 | orchestrator |
2026-04-07 00:51:37.315380 | orchestrator | TASK [haproxy-config : Configuring firewall for ceph-rgw] **********************
2026-04-07 00:51:37.315386 | orchestrator | Tuesday 07 April 2026 00:47:44 +0000 (0:00:02.101) 0:01:29.220 *********
2026-04-07 00:51:37.315392 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})
2026-04-07 00:51:37.315405 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})
2026-04-07 00:51:37.315412 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.315418 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})
2026-04-07 00:51:37.315425 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})
2026-04-07 00:51:37.315436 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.315443 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})
2026-04-07 00:51:37.315449 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})
2026-04-07 00:51:37.315456 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.315462 | orchestrator |
2026-04-07 00:51:37.315468 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL users config] ***********
2026-04-07 00:51:37.315474 | orchestrator | Tuesday 07 April 2026 00:47:46 +0000 (0:00:02.545) 0:01:31.766 *********
2026-04-07 00:51:37.315481 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.315487 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.315493 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.315499 | orchestrator |
2026-04-07 00:51:37.315535 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL rules config] ***********
2026-04-07 00:51:37.315542 | orchestrator | Tuesday 07 April 2026 00:47:47 +0000 (0:00:00.408) 0:01:32.174 *********
2026-04-07 00:51:37.315548 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.315555 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.315561 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.315567 | orchestrator |
2026-04-07 00:51:37.315574 | orchestrator | TASK [include_role : cinder] ***************************************************
2026-04-07 00:51:37.315580 | orchestrator | Tuesday 07 April 2026 00:47:48 +0000 (0:00:01.034) 0:01:33.209 *********
2026-04-07 00:51:37.315586 | orchestrator | included: cinder for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.315593 | orchestrator |
2026-04-07 00:51:37.315599 | orchestrator | TASK [haproxy-config : Copying over cinder haproxy config] *********************
2026-04-07 00:51:37.315605 | orchestrator | Tuesday 07 April 2026 00:47:49 +0000 (0:00:00.778) 0:01:33.988 *********
2026-04-07 00:51:37.315656 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.315666 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.315678 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.315687 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.315695 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.315708 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.315719 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.315732 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.315739 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.315745 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.315758 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.315768 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL',
'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315779 | orchestrator | 2026-04-07 00:51:37.315786 | orchestrator | TASK [haproxy-config : Add configuration for cinder when using single external frontend] *** 2026-04-07 00:51:37.315793 | orchestrator | Tuesday 07 April 2026 00:47:52 +0000 (0:00:03.180) 0:01:37.169 ********* 2026-04-07 00:51:37.315800 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.315807 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315814 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315825 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315832 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.315842 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-api:26.2.1.20260328', 'volumes': 
['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.315868 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315874 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', 
'/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315881 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315888 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.315899 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.315914 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315921 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315928 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.315934 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.315940 | orchestrator | 2026-04-07 00:51:37.315947 | orchestrator | TASK [haproxy-config : Configuring firewall for cinder] ************************ 2026-04-07 00:51:37.315954 | orchestrator | Tuesday 07 April 2026 00:47:53 +0000 (0:00:00.836) 0:01:38.006 ********* 2026-04-07 00:51:37.315961 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.315970 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.315977 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.315983 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.315990 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.315997 | orchestrator | skipping: 
[testbed-node-1] 2026-04-07 00:51:37.316004 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.316020 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.316027 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.316034 | orchestrator | 2026-04-07 00:51:37.316040 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL users config] ************* 2026-04-07 00:51:37.316047 | orchestrator | Tuesday 07 April 2026 00:47:54 +0000 (0:00:01.253) 0:01:39.259 ********* 2026-04-07 00:51:37.316054 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.316061 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.316068 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.316074 | orchestrator | 2026-04-07 00:51:37.316081 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL rules config] ************* 2026-04-07 00:51:37.316088 | orchestrator | Tuesday 07 April 2026 00:47:55 +0000 (0:00:01.242) 0:01:40.502 ********* 2026-04-07 00:51:37.316094 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.316101 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.316108 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.316114 | orchestrator | 2026-04-07 00:51:37.316121 | orchestrator | TASK [include_role : cloudkitty] *********************************************** 2026-04-07 00:51:37.316134 | orchestrator | Tuesday 07 April 2026 00:47:57 +0000 (0:00:01.962) 0:01:42.465 ********* 2026-04-07 00:51:37.316141 | orchestrator | 
skipping: [testbed-node-0] 2026-04-07 00:51:37.316147 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.316154 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.316160 | orchestrator | 2026-04-07 00:51:37.316167 | orchestrator | TASK [include_role : cyborg] *************************************************** 2026-04-07 00:51:37.316173 | orchestrator | Tuesday 07 April 2026 00:47:57 +0000 (0:00:00.301) 0:01:42.766 ********* 2026-04-07 00:51:37.316180 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.316187 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.316193 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.316200 | orchestrator | 2026-04-07 00:51:37.316206 | orchestrator | TASK [include_role : designate] ************************************************ 2026-04-07 00:51:37.316213 | orchestrator | Tuesday 07 April 2026 00:47:58 +0000 (0:00:00.276) 0:01:43.042 ********* 2026-04-07 00:51:37.316219 | orchestrator | included: designate for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.316226 | orchestrator | 2026-04-07 00:51:37.316233 | orchestrator | TASK [haproxy-config : Copying over designate haproxy config] ****************** 2026-04-07 00:51:37.316239 | orchestrator | Tuesday 07 April 2026 00:47:59 +0000 (0:00:00.885) 0:01:43.928 ********* 2026-04-07 00:51:37.316247 | orchestrator | changed: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 
'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.316254 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-04-07 00:51:37.316268 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.316892 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.316922 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.316929 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.316937 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 
'registry.osism.tech/kolla/release//designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.316944 | orchestrator | changed: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.316965 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 
'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-04-07 00:51:37.316973 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.316982 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.316988 | orchestrator | changed: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.316995 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.317006 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.317015 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 
'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})
2026-04-07 00:51:37.317022 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317031 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317038 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317045 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317058 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317064 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317071 | orchestrator |
2026-04-07 00:51:37.317077 | orchestrator | TASK [haproxy-config : Add configuration for designate when using single external frontend] ***
2026-04-07 00:51:37.317084 | orchestrator | Tuesday 07 April 2026 00:48:03 +0000 (0:00:04.374) 0:01:48.302 *********
2026-04-07 00:51:37.317094 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.317104 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})
2026-04-07 00:51:37.317111 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317122 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317129 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317138 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317144 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317150 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.317197 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.317243 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})
2026-04-07 00:51:37.317255 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317261 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317271 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317278 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317288 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317294 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.317301 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.317312 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})
2026-04-07 00:51:37.317319 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317330 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317336 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317392 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317401 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.317416 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.317422 | orchestrator |
2026-04-07 00:51:37.317429 | orchestrator | TASK [haproxy-config : Configuring firewall for designate] *********************
2026-04-07 00:51:37.317435 | orchestrator | Tuesday 07 April 2026 00:48:04 +0000 (0:00:00.954) 0:01:49.256 *********
2026-04-07 00:51:37.317441 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.317450 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.317458 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.317464 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.317474 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.317480 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.317488 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-07 00:51:37.317494 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.317501 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.317508 | orchestrator |
2026-04-07 00:51:37.317515 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL users config] **********
2026-04-07 00:51:37.317521 | orchestrator | Tuesday 07 April 2026 00:48:05 +0000 (0:00:00.940) 0:01:50.197 *********
2026-04-07 00:51:37.317528 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.317538 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.317545 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.317552 | orchestrator |
2026-04-07 00:51:37.317559 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL rules config] **********
2026-04-07 00:51:37.317566 | orchestrator | Tuesday 07 April 2026 00:48:06 +0000 (0:00:01.291) 0:01:51.489 *********
2026-04-07 00:51:37.317573 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.317580 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.317587 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.317593 | orchestrator |
2026-04-07 00:51:37.317600 | orchestrator | TASK [include_role : etcd] *****************************************************
2026-04-07 00:51:37.317606 | orchestrator | Tuesday 07 April 2026 00:48:08 +0000 (0:00:00.244) 0:01:53.286 *********
2026-04-07 00:51:37.317613 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.317619 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.317626 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.317632 | orchestrator |
2026-04-07 00:51:37.317639 | orchestrator | TASK [include_role : glance] ***************************************************
2026-04-07 00:51:37.317651 | orchestrator | Tuesday 07 April 2026 00:48:08 +0000 (0:00:00.675) 0:01:53.530 *********
2026-04-07 00:51:37.317658 | orchestrator | included: glance for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.317665 | orchestrator |
2026-04-07 00:51:37.317671 | orchestrator | TASK [haproxy-config : Copying over glance haproxy config] *********************
2026-04-07 00:51:37.317678 | orchestrator | Tuesday 07 April 2026 00:48:09 +0000 (0:00:00.675) 0:01:54.206 *********
2026-04-07 00:51:37.317687 | orchestrator | changed: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})
2026-04-07 00:51:37.317761 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})
2026-04-07 00:51:37.317780 | orchestrator | changed: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})
2026-04-07 00:51:37.317792 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})
2026-04-07 00:51:37.317803 | orchestrator | changed: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})
2026-04-07 00:51:37.317817 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})
2026-04-07 00:51:37.317824 | orchestrator |
2026-04-07 00:51:37.317831 | orchestrator | TASK [haproxy-config : Add configuration for glance when using single external frontend] ***
2026-04-07 00:51:37.317841 | orchestrator | Tuesday 07 April 2026 00:48:14 +0000 (0:00:04.874) 0:01:59.081 *********
2026-04-07 00:51:37.317851 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})
2026-04-07 00:51:37.317864 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})
2026-04-07 00:51:37.317872 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.317887 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})
2026-04-07 00:51:37.317899 |
orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-04-07 00:51:37.317907 | 
orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.317977 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2026-04-07 00:51:37.317992 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 
'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-04-07 00:51:37.317999 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.318005 | orchestrator | 2026-04-07 00:51:37.318052 | orchestrator | TASK 
[haproxy-config : Configuring firewall for glance] ************************ 2026-04-07 00:51:37.318060 | orchestrator | Tuesday 07 April 2026 00:48:17 +0000 (0:00:03.148) 0:02:02.229 ********* 2026-04-07 00:51:37.318067 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-07 00:51:37.318079 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-07 00:51:37.318091 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.318097 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-07 00:51:37.318107 | orchestrator | 
skipping: [testbed-node-1] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-07 00:51:37.318115 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.318121 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-07 00:51:37.318128 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-07 00:51:37.318134 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.318140 | orchestrator | 2026-04-07 00:51:37.318146 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL users config] 
************* 2026-04-07 00:51:37.318153 | orchestrator | Tuesday 07 April 2026 00:48:20 +0000 (0:00:03.357) 0:02:05.587 ********* 2026-04-07 00:51:37.318159 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.318166 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.318190 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.318196 | orchestrator | 2026-04-07 00:51:37.318203 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL rules config] ************* 2026-04-07 00:51:37.318209 | orchestrator | Tuesday 07 April 2026 00:48:22 +0000 (0:00:01.400) 0:02:06.987 ********* 2026-04-07 00:51:37.318215 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.318221 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.318227 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.318234 | orchestrator | 2026-04-07 00:51:37.318240 | orchestrator | TASK [include_role : gnocchi] ************************************************** 2026-04-07 00:51:37.318246 | orchestrator | Tuesday 07 April 2026 00:48:23 +0000 (0:00:01.853) 0:02:08.841 ********* 2026-04-07 00:51:37.318252 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.318258 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.318272 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.318278 | orchestrator | 2026-04-07 00:51:37.318284 | orchestrator | TASK [include_role : grafana] ************************************************** 2026-04-07 00:51:37.318290 | orchestrator | Tuesday 07 April 2026 00:48:24 +0000 (0:00:00.288) 0:02:09.130 ********* 2026-04-07 00:51:37.318296 | orchestrator | included: grafana for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.318303 | orchestrator | 2026-04-07 00:51:37.318309 | orchestrator | TASK [haproxy-config : Copying over grafana haproxy config] ******************** 2026-04-07 00:51:37.318316 | orchestrator | Tuesday 07 April 2026 00:48:25 +0000 (0:00:00.795) 
0:02:09.925 ********* 2026-04-07 00:51:37.318328 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.318339 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.318362 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.318368 | orchestrator | 2026-04-07 00:51:37.318375 | orchestrator | TASK [haproxy-config : Add configuration for grafana when using single external frontend] *** 2026-04-07 00:51:37.318381 | orchestrator | Tuesday 07 April 2026 00:48:28 +0000 (0:00:02.957) 0:02:12.883 ********* 2026-04-07 00:51:37.318387 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.318399 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.318406 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.318412 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.318422 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.318428 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.318434 | orchestrator | 2026-04-07 00:51:37.318439 | orchestrator | TASK [haproxy-config : Configuring firewall for grafana] *********************** 2026-04-07 00:51:37.318471 | orchestrator | Tuesday 07 April 2026 00:48:28 +0000 (0:00:00.378) 0:02:13.261 ********* 2026-04-07 00:51:37.318477 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.318488 | orchestrator | skipping: 
[testbed-node-0] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.318495 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.318501 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.318506 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.318512 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.318518 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.318523 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.318529 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.318535 | orchestrator | 2026-04-07 00:51:37.318541 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL users config] ************ 2026-04-07 00:51:37.318553 | orchestrator | Tuesday 07 April 2026 00:48:29 +0000 (0:00:00.638) 0:02:13.899 ********* 2026-04-07 00:51:37.318560 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.318566 | orchestrator | changed: [testbed-node-1] 
2026-04-07 00:51:37.318591 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.318597 | orchestrator | 2026-04-07 00:51:37.318603 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL rules config] ************ 2026-04-07 00:51:37.318609 | orchestrator | Tuesday 07 April 2026 00:48:30 +0000 (0:00:01.188) 0:02:15.088 ********* 2026-04-07 00:51:37.318615 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.318657 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.318664 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.318670 | orchestrator | 2026-04-07 00:51:37.318676 | orchestrator | TASK [include_role : heat] ***************************************************** 2026-04-07 00:51:37.318682 | orchestrator | Tuesday 07 April 2026 00:48:31 +0000 (0:00:01.620) 0:02:16.709 ********* 2026-04-07 00:51:37.318689 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.318695 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.318701 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.318707 | orchestrator | 2026-04-07 00:51:37.318714 | orchestrator | TASK [include_role : horizon] ************************************************** 2026-04-07 00:51:37.318720 | orchestrator | Tuesday 07 April 2026 00:48:32 +0000 (0:00:00.482) 0:02:17.191 ********* 2026-04-07 00:51:37.318727 | orchestrator | included: horizon for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.318733 | orchestrator | 2026-04-07 00:51:37.318739 | orchestrator | TASK [haproxy-config : Copying over horizon haproxy config] ******************** 2026-04-07 00:51:37.318745 | orchestrator | Tuesday 07 April 2026 00:48:33 +0000 (0:00:00.860) 0:02:18.052 ********* 2026-04-07 00:51:37.318772 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 
'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 
'custom_member_list': []}}}}) 2026-04-07 00:51:37.318780 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 
'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:51:37.318802 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 
'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:51:37.318815 | orchestrator | 2026-04-07 00:51:37.318821 | orchestrator | TASK [haproxy-config : Add configuration for horizon when using single external frontend] *** 2026-04-07 00:51:37.318827 | orchestrator | Tuesday 07 April 2026 00:48:36 +0000 (0:00:03.433) 0:02:21.486 ********* 2026-04-07 00:51:37.318838 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': 
'443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:51:37.318845 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.318855 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:51:37.318867 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.318878 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 
'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:51:37.318885 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.318892 | orchestrator | 2026-04-07 00:51:37.318898 | 
orchestrator | TASK [haproxy-config : Configuring firewall for horizon] *********************** 2026-04-07 00:51:37.318905 | orchestrator | Tuesday 07 April 2026 00:48:37 +0000 (0:00:00.942) 0:02:22.428 ********* 2026-04-07 00:51:37.318916 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-07 00:51:37.318927 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-07 00:51:37.318936 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-07 00:51:37.318944 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-07 00:51:37.318950 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-04-07 00:51:37.318958 | orchestrator | skipping: [testbed-node-0] 2026-04-07 
00:51:37.318964 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-07 00:51:37.318970 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-07 00:51:37.318977 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-07 00:51:37.318984 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-07 00:51:37.318990 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-04-07 00:51:37.318998 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.319008 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend 
acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-07 00:51:37.319015 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-07 00:51:37.319024 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-07 00:51:37.319035 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-07 00:51:37.319042 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-04-07 00:51:37.319048 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.319054 | orchestrator | 2026-04-07 00:51:37.319060 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL users config] ************ 2026-04-07 00:51:37.319066 | orchestrator | Tuesday 07 April 2026 00:48:38 +0000 (0:00:00.913) 0:02:23.341 ********* 2026-04-07 00:51:37.319073 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.319079 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.319085 | 
orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.319091 | orchestrator | 2026-04-07 00:51:37.319098 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL rules config] ************ 2026-04-07 00:51:37.319104 | orchestrator | Tuesday 07 April 2026 00:48:39 +0000 (0:00:01.073) 0:02:24.415 ********* 2026-04-07 00:51:37.319110 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.319117 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.319123 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.319129 | orchestrator | 2026-04-07 00:51:37.319135 | orchestrator | TASK [include_role : influxdb] ************************************************* 2026-04-07 00:51:37.319140 | orchestrator | Tuesday 07 April 2026 00:48:41 +0000 (0:00:01.922) 0:02:26.337 ********* 2026-04-07 00:51:37.319146 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.319152 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.319159 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.319165 | orchestrator | 2026-04-07 00:51:37.319171 | orchestrator | TASK [include_role : ironic] *************************************************** 2026-04-07 00:51:37.319177 | orchestrator | Tuesday 07 April 2026 00:48:41 +0000 (0:00:00.462) 0:02:26.799 ********* 2026-04-07 00:51:37.319184 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.319203 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.319209 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.319215 | orchestrator | 2026-04-07 00:51:37.319222 | orchestrator | TASK [include_role : keystone] ************************************************* 2026-04-07 00:51:37.319228 | orchestrator | Tuesday 07 April 2026 00:48:42 +0000 (0:00:00.287) 0:02:27.087 ********* 2026-04-07 00:51:37.319234 | orchestrator | included: keystone for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.319240 | orchestrator | 2026-04-07 
00:51:37.319247 | orchestrator | TASK [haproxy-config : Copying over keystone haproxy config] ******************* 2026-04-07 00:51:37.319253 | orchestrator | Tuesday 07 April 2026 00:48:43 +0000 (0:00:00.910) 0:02:27.998 ********* 2026-04-07 00:51:37.319264 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:51:37.319281 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': 
'30'}}})  2026-04-07 00:51:37.319293 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:51:37.319300 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:51:37.319307 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:51:37.319314 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:51:37.319325 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option 
httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:51:37.319340 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:51:37.319362 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:51:37.319368 | orchestrator | 2026-04-07 00:51:37.319480 | orchestrator | TASK [haproxy-config : Add configuration for keystone when using single external frontend] *** 2026-04-07 00:51:37.319490 | orchestrator | Tuesday 07 April 2026 00:48:46 +0000 (0:00:03.644) 0:02:31.643 ********* 2026-04-07 00:51:37.319497 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': 
{'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:51:37.319504 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:51:37.319522 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': 
['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:51:37.319529 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.319540 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:51:37.319547 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:51:37.319554 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:51:37.319560 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.319567 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 
'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:51:37.319583 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:51:37.319593 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:51:37.319600 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.319607 | orchestrator | 2026-04-07 00:51:37.319614 | orchestrator | TASK [haproxy-config : Configuring firewall for keystone] ********************** 2026-04-07 00:51:37.319620 | orchestrator | Tuesday 07 April 2026 00:48:47 +0000 (0:00:00.570) 0:02:32.213 ********* 2026-04-07 00:51:37.319626 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 
'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-07 00:51:37.319633 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-07 00:51:37.319640 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.319646 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-07 00:51:37.319653 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-07 00:51:37.319659 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.319665 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-07 00:51:37.319672 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-07 00:51:37.319683 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.319690 | orchestrator | 2026-04-07 00:51:37.319696 | 
orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL users config] *********** 2026-04-07 00:51:37.319702 | orchestrator | Tuesday 07 April 2026 00:48:48 +0000 (0:00:00.817) 0:02:33.030 ********* 2026-04-07 00:51:37.319708 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.319714 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.319721 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.319727 | orchestrator | 2026-04-07 00:51:37.319733 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL rules config] *********** 2026-04-07 00:51:37.319740 | orchestrator | Tuesday 07 April 2026 00:48:49 +0000 (0:00:01.333) 0:02:34.363 ********* 2026-04-07 00:51:37.319746 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.319752 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.319759 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.319766 | orchestrator | 2026-04-07 00:51:37.319772 | orchestrator | TASK [include_role : letsencrypt] ********************************************** 2026-04-07 00:51:37.319779 | orchestrator | Tuesday 07 April 2026 00:48:51 +0000 (0:00:01.738) 0:02:36.102 ********* 2026-04-07 00:51:37.319785 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.319791 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.319797 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.319803 | orchestrator | 2026-04-07 00:51:37.319809 | orchestrator | TASK [include_role : magnum] *************************************************** 2026-04-07 00:51:37.319815 | orchestrator | Tuesday 07 April 2026 00:48:51 +0000 (0:00:00.526) 0:02:36.628 ********* 2026-04-07 00:51:37.319821 | orchestrator | included: magnum for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.319827 | orchestrator | 2026-04-07 00:51:37.319837 | orchestrator | TASK [haproxy-config : Copying over magnum haproxy config] ********************* 2026-04-07 
00:51:37.319844 | orchestrator | Tuesday 07 April 2026 00:48:52 +0000 (0:00:01.036) 0:02:37.665 ********* 2026-04-07 00:51:37.319855 | orchestrator | changed: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.319862 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.319869 | 
orchestrator | changed: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.319882 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.319893 | orchestrator | changed: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.319904 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.319911 | orchestrator | 2026-04-07 00:51:37.319918 | orchestrator | TASK [haproxy-config : Add configuration for magnum when using single external frontend] *** 2026-04-07 00:51:37.319924 | orchestrator | Tuesday 07 April 2026 00:48:56 +0000 (0:00:04.056) 0:02:41.722 ********* 2026-04-07 
00:51:37.319932 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.319945 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.319952 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.319963 | orchestrator | skipping: [testbed-node-1] => 
(item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.319974 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.319981 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.320036 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 
'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.320065 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320072 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.320078 | orchestrator | 2026-04-07 00:51:37.320085 | orchestrator | TASK [haproxy-config : Configuring firewall for magnum] ************************ 2026-04-07 00:51:37.320091 | orchestrator | 
Tuesday 07 April 2026 00:48:57 +0000 (0:00:00.888) 0:02:42.610 ********* 2026-04-07 00:51:37.320098 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.320106 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.320113 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.320119 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.320129 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.320135 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.320163 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.320170 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.320176 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.320182 | orchestrator | 2026-04-07 
00:51:37.320189 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL users config] ************* 2026-04-07 00:51:37.320195 | orchestrator | Tuesday 07 April 2026 00:48:58 +0000 (0:00:00.832) 0:02:43.443 ********* 2026-04-07 00:51:37.320202 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.320208 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.320214 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.320221 | orchestrator | 2026-04-07 00:51:37.320233 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL rules config] ************* 2026-04-07 00:51:37.320240 | orchestrator | Tuesday 07 April 2026 00:48:59 +0000 (0:00:01.121) 0:02:44.565 ********* 2026-04-07 00:51:37.320251 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.320257 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.320263 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.320269 | orchestrator | 2026-04-07 00:51:37.320276 | orchestrator | TASK [include_role : manila] *************************************************** 2026-04-07 00:51:37.320282 | orchestrator | Tuesday 07 April 2026 00:49:01 +0000 (0:00:01.883) 0:02:46.448 ********* 2026-04-07 00:51:37.320288 | orchestrator | included: manila for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.320294 | orchestrator | 2026-04-07 00:51:37.320301 | orchestrator | TASK [haproxy-config : Copying over manila haproxy config] ********************* 2026-04-07 00:51:37.320307 | orchestrator | Tuesday 07 April 2026 00:49:02 +0000 (0:00:01.176) 0:02:47.625 ********* 2026-04-07 00:51:37.320315 | orchestrator | changed: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release//manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.320328 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release//manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320335 | orchestrator | changed: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release//manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 
'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.320390 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release//manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320408 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release//manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320415 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release//manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': 
['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320422 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release//manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320429 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release//manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320440 | orchestrator | changed: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release//manila-api:20.0.2.20260328', 'enabled': True, 
'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.320447 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release//manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320462 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release//manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320469 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release//manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320475 | orchestrator | 2026-04-07 00:51:37.320481 | orchestrator | TASK [haproxy-config : Add configuration for manila when using single external frontend] *** 2026-04-07 00:51:37.320488 | orchestrator | Tuesday 07 April 2026 00:49:06 +0000 (0:00:03.652) 0:02:51.277 ********* 2026-04-07 00:51:37.320495 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release//manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 
'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.320501 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release//manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320512 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release//manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320524 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release//manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320531 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.320538 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release//manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.320545 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release//manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.320576 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 
'manila-share', 'image': 'registry.osism.tech/kolla/release//manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.321113 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release//manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.321133 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.321141 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release//manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 
'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.321160 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release//manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.321166 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release//manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.321173 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release//manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': 
['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.321180 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.321186 | orchestrator | 2026-04-07 00:51:37.321193 | orchestrator | TASK [haproxy-config : Configuring firewall for manila] ************************ 2026-04-07 00:51:37.321199 | orchestrator | Tuesday 07 April 2026 00:49:07 +0000 (0:00:01.033) 0:02:52.311 ********* 2026-04-07 00:51:37.321206 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.321231 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.321238 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.321245 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.321258 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.321264 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.321270 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.321387 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.321396 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.321402 | orchestrator | 2026-04-07 00:51:37.321408 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL users config] ************* 2026-04-07 00:51:37.321415 | orchestrator | Tuesday 07 April 2026 00:49:08 +0000 (0:00:01.144) 0:02:53.456 ********* 2026-04-07 00:51:37.321421 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.321427 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.321433 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.321439 | orchestrator | 2026-04-07 00:51:37.321445 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL rules config] ************* 2026-04-07 00:51:37.321452 | orchestrator | Tuesday 07 April 2026 00:49:09 +0000 (0:00:01.205) 0:02:54.662 ********* 2026-04-07 00:51:37.321458 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.321469 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.321475 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.321481 | orchestrator | 2026-04-07 00:51:37.321487 | orchestrator | TASK [include_role : mariadb] ************************************************** 2026-04-07 00:51:37.321494 | orchestrator | Tuesday 07 April 2026 00:49:11 +0000 (0:00:01.963) 0:02:56.625 ********* 2026-04-07 00:51:37.321500 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 
00:51:37.321506 | orchestrator | 2026-04-07 00:51:37.321512 | orchestrator | TASK [mariadb : Ensure mysql monitor user exist] ******************************* 2026-04-07 00:51:37.321518 | orchestrator | Tuesday 07 April 2026 00:49:12 +0000 (0:00:00.974) 0:02:57.600 ********* 2026-04-07 00:51:37.321525 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-04-07 00:51:37.321532 | orchestrator | 2026-04-07 00:51:37.321538 | orchestrator | TASK [haproxy-config : Copying over mariadb haproxy config] ******************** 2026-04-07 00:51:37.321544 | orchestrator | Tuesday 07 April 2026 00:49:14 +0000 (0:00:01.631) 0:02:59.232 ********* 2026-04-07 00:51:37.321567 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 
'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:51:37.321583 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-07 00:51:37.321590 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.321600 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 
'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:51:37.321607 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-07 00:51:37.321614 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.321644 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 
'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:51:37.321657 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 
'registry.osism.tech/kolla/release//mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-07 00:51:37.321664 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.321670 | orchestrator | 2026-04-07 00:51:37.321684 | orchestrator | TASK [haproxy-config : Add configuration for mariadb when using single external frontend] *** 2026-04-07 00:51:37.321690 | orchestrator | Tuesday 07 April 2026 00:49:17 +0000 (0:00:02.943) 0:03:02.175 ********* 2026-04-07 00:51:37.321697 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check 
port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:51:37.321723 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-07 00:51:37.321730 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.321741 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:51:37.321749 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-07 00:51:37.321755 | orchestrator | skipping: 
[testbed-node-2] 2026-04-07 00:51:37.321776 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:51:37.321789 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 
'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release//mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-07 00:51:37.321796 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.321803 | orchestrator | 2026-04-07 00:51:37.321810 | orchestrator | TASK [haproxy-config : Configuring firewall for mariadb] *********************** 2026-04-07 00:51:37.321817 | orchestrator | Tuesday 07 April 2026 00:49:19 +0000 (0:00:02.003) 0:03:04.179 ********* 2026-04-07 00:51:37.321826 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2026-04-07 00:51:37.321835 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 
testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2026-04-07 00:51:37.321842 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.321850 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2026-04-07 00:51:37.321861 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2026-04-07 00:51:37.321868 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.323004 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' 
server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2026-04-07 00:51:37.323020 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2026-04-07 00:51:37.323056 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.323064 | orchestrator | 2026-04-07 00:51:37.323071 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL users config] ************ 2026-04-07 00:51:37.323078 | orchestrator | Tuesday 07 April 2026 00:49:21 +0000 (0:00:02.205) 0:03:06.384 ********* 2026-04-07 00:51:37.323085 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.323092 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.323098 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.323106 | orchestrator | 2026-04-07 00:51:37.323113 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL rules config] ************ 2026-04-07 00:51:37.323119 | orchestrator | Tuesday 07 April 2026 00:49:23 +0000 (0:00:01.860) 0:03:08.245 ********* 2026-04-07 00:51:37.323125 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.323131 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.323137 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.323143 | orchestrator | 2026-04-07 00:51:37.323149 | orchestrator | TASK [include_role : 
masakari] ************************************************* 2026-04-07 00:51:37.323156 | orchestrator | Tuesday 07 April 2026 00:49:24 +0000 (0:00:01.176) 0:03:09.421 ********* 2026-04-07 00:51:37.323163 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.323175 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.323181 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.323188 | orchestrator | 2026-04-07 00:51:37.323195 | orchestrator | TASK [include_role : memcached] ************************************************ 2026-04-07 00:51:37.323203 | orchestrator | Tuesday 07 April 2026 00:49:24 +0000 (0:00:00.248) 0:03:09.669 ********* 2026-04-07 00:51:37.323209 | orchestrator | included: memcached for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.323216 | orchestrator | 2026-04-07 00:51:37.323222 | orchestrator | TASK [haproxy-config : Copying over memcached haproxy config] ****************** 2026-04-07 00:51:37.323238 | orchestrator | Tuesday 07 April 2026 00:49:25 +0000 (0:00:01.021) 0:03:10.691 ********* 2026-04-07 00:51:37.323246 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-04-07 00:51:37.323254 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': 
{'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-04-07 00:51:37.323267 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-04-07 00:51:37.323273 | orchestrator | 2026-04-07 00:51:37.323280 | orchestrator | TASK [haproxy-config : Add configuration for memcached when using single external frontend] *** 2026-04-07 00:51:37.323287 | orchestrator | Tuesday 07 April 2026 00:49:27 +0000 (0:00:01.618) 0:03:12.310 ********* 2026-04-07 00:51:37.323293 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 
'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2026-04-07 00:51:37.323300 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.323309 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2026-04-07 00:51:37.323320 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.323326 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release//memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': 
{}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2026-04-07 00:51:37.323333 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.323340 | orchestrator | 2026-04-07 00:51:37.323363 | orchestrator | TASK [haproxy-config : Configuring firewall for memcached] ********************* 2026-04-07 00:51:37.323369 | orchestrator | Tuesday 07 April 2026 00:49:27 +0000 (0:00:00.333) 0:03:12.644 ********* 2026-04-07 00:51:37.323375 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2026-04-07 00:51:37.323383 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2026-04-07 00:51:37.323390 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.323397 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.323403 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2026-04-07 00:51:37.323409 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.323415 | orchestrator | 2026-04-07 00:51:37.323489 | 
orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL users config] ********** 2026-04-07 00:51:37.323497 | orchestrator | Tuesday 07 April 2026 00:49:28 +0000 (0:00:00.529) 0:03:13.174 ********* 2026-04-07 00:51:37.323503 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.323510 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.323516 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.323522 | orchestrator | 2026-04-07 00:51:37.323529 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL rules config] ********** 2026-04-07 00:51:37.323554 | orchestrator | Tuesday 07 April 2026 00:49:28 +0000 (0:00:00.552) 0:03:13.727 ********* 2026-04-07 00:51:37.323561 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.323567 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.323574 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.323581 | orchestrator | 2026-04-07 00:51:37.323588 | orchestrator | TASK [include_role : mistral] ************************************************** 2026-04-07 00:51:37.323594 | orchestrator | Tuesday 07 April 2026 00:49:29 +0000 (0:00:01.056) 0:03:14.783 ********* 2026-04-07 00:51:37.323601 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.323607 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.323614 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.323626 | orchestrator | 2026-04-07 00:51:37.323633 | orchestrator | TASK [include_role : neutron] ************************************************** 2026-04-07 00:51:37.323640 | orchestrator | Tuesday 07 April 2026 00:49:30 +0000 (0:00:00.248) 0:03:15.031 ********* 2026-04-07 00:51:37.323646 | orchestrator | included: neutron for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.323653 | orchestrator | 2026-04-07 00:51:37.323659 | orchestrator | TASK [haproxy-config : Copying over neutron haproxy config] ******************** 
2026-04-07 00:51:37.323666 | orchestrator | Tuesday 07 April 2026 00:49:31 +0000 (0:00:01.044) 0:03:16.075 ********* 2026-04-07 00:51:37.323678 | orchestrator | changed: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release//neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.323687 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release//neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': 
'30'}}})  2026-04-07 00:51:37.323695 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-07 00:51:37.323708 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-07 00:51:37.323723 | orchestrator | changed: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release//neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.323732 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release//neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.323741 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 
'image': 'registry.osism.tech/kolla/release//neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.323748 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release//neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.323760 | orchestrator | changed: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release//neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 
'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.323773 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release//neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.323784 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-07 00:51:37.323792 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 
'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-04-07 00:51:37.323799 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-07 00:51:37.323812 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release//neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 
'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.323823 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release//neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.323833 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-07 
00:51:37.323841 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release//neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.323848 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release//neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.323856 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release//neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.323868 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 
'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-07 00:51:37.323883 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release//neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-07 00:51:37.323894 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', 
'/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-04-07 00:51:37.323901 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.323916 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-07 00:51:37.323947 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 
'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.323958 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release//ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.323965 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release//neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port 
neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.323975 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release//neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.323988 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-07 00:51:37.323995 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 
'registry.osism.tech/kolla/release//neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324006 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release//neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-07 00:51:37.324016 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release//neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.324023 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release//neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': 
['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324033 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324040 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release//ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324046 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-04-07 00:51:37.324056 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-07 00:51:37.324067 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.324077 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release//neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.324084 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release//neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324091 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release//neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-07 00:51:37.324098 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324139 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release//ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324147 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': 
'30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}}) 
2026-04-07 00:51:37.324158 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release//neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}}) 
2026-04-07 00:51:37.324165 | orchestrator | 
2026-04-07 00:51:37.324171 | orchestrator | TASK [haproxy-config : Add configuration for neutron when using single external frontend] ***
2026-04-07 00:51:37.324178 | orchestrator | Tuesday 07 April 2026 00:49:35 +0000 (0:00:03.940) 0:03:20.016 *********
2026-04-07 00:51:37.324185 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release//neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.324192 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release//neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324207 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-07 00:51:37.324216 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release//neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.324224 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 
'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-07 00:51:37.324231 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release//neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324246 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release//neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324253 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-07 00:51:37.324260 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release//neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324270 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release//neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': 
['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324277 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-07 00:51:37.324288 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  
2026-04-07 00:51:37.324299 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release//neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324306 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release//neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324312 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 
'timeout': '30'}}})  2026-04-07 00:51:37.324324 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release//neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324331 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-04-07 00:51:37.324338 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release//neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  
2026-04-07 00:51:37.324395 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.324402 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release//neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-07 00:51:37.324425 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release//neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324435 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324441 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release//neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-07 00:51:37.324448 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release//ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324460 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324471 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release//ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324478 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release//neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.324488 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-07 00:51:37.324495 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release//neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324863 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-07 00:51:37.324878 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 
'', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-07 00:51:37.324889 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release//neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.324896 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release//neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-07 00:51:37.324909 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release//neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.324915 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.324922 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.324955 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release//neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.324963 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release//neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': 
['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324970 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release//neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.324980 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-04-07 00:51:37.324987 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 
'/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.324999 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release//neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.325005 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release//neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-07 00:51:37.325046 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release//neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': 
['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-07 00:51:37.325054 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release//ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.325064 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-07 
00:51:37.325071 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release//neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-07 00:51:37.325083 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.325089 | orchestrator | 2026-04-07 00:51:37.325096 | orchestrator | TASK [haproxy-config : Configuring firewall for neutron] *********************** 2026-04-07 00:51:37.325102 | orchestrator | Tuesday 07 April 2026 00:49:36 +0000 (0:00:01.282) 0:03:21.298 ********* 2026-04-07 00:51:37.325109 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.325116 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.325122 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.325128 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.325134 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.325140 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.325146 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.325189 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.325197 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.325203 | orchestrator | 2026-04-07 00:51:37.325209 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL users config] ************ 2026-04-07 00:51:37.325215 | orchestrator | Tuesday 07 April 2026 00:49:37 +0000 (0:00:01.212) 0:03:22.511 ********* 2026-04-07 00:51:37.325222 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.325228 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.325234 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.325240 | orchestrator | 2026-04-07 00:51:37.325246 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL rules config] ************ 2026-04-07 00:51:37.325252 | orchestrator | Tuesday 07 April 2026 00:49:39 +0000 (0:00:01.512) 0:03:24.023 ********* 2026-04-07 00:51:37.325258 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.325265 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.325270 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.325277 | orchestrator | 2026-04-07 00:51:37.325283 | 
orchestrator | TASK [include_role : placement] ************************************************ 2026-04-07 00:51:37.325289 | orchestrator | Tuesday 07 April 2026 00:49:41 +0000 (0:00:01.948) 0:03:25.972 ********* 2026-04-07 00:51:37.325295 | orchestrator | included: placement for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.325301 | orchestrator | 2026-04-07 00:51:37.325314 | orchestrator | TASK [haproxy-config : Copying over placement haproxy config] ****************** 2026-04-07 00:51:37.325320 | orchestrator | Tuesday 07 April 2026 00:49:42 +0000 (0:00:01.040) 0:03:27.012 ********* 2026-04-07 00:51:37.325331 | orchestrator | changed: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release//placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-04-07 00:51:37.325338 | orchestrator | changed: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release//placement-api:13.0.0.20260328', 'enabled': True, 'volumes': 
['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-04-07 00:51:37.325424 | orchestrator | changed: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release//placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-04-07 00:51:37.325433 | orchestrator | 2026-04-07 00:51:37.325439 | orchestrator | TASK 
[haproxy-config : Add configuration for placement when using single external frontend] *** 2026-04-07 00:51:37.325446 | orchestrator | Tuesday 07 April 2026 00:49:45 +0000 (0:00:03.128) 0:03:30.141 ********* 2026-04-07 00:51:37.325453 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release//placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-07 00:51:37.325465 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.325537 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release//placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 
'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-07 00:51:37.325555 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.325561 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release//placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-07 00:51:37.325568 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.325574 | orchestrator | 2026-04-07 00:51:37.325581 | orchestrator | TASK [haproxy-config : Configuring firewall for placement] ********************* 2026-04-07 00:51:37.325587 | orchestrator | Tuesday 07 April 2026 00:49:46 +0000 (0:00:00.802) 0:03:30.943 ********* 2026-04-07 
00:51:37.325635 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-07 00:51:37.325644 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-07 00:51:37.325651 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.325658 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-07 00:51:37.325671 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-07 00:51:37.325677 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.325684 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-07 00:51:37.325690 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-07 00:51:37.325697 | orchestrator | 
skipping: [testbed-node-2] 2026-04-07 00:51:37.325703 | orchestrator | 2026-04-07 00:51:37.325709 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL users config] ********** 2026-04-07 00:51:37.325720 | orchestrator | Tuesday 07 April 2026 00:49:46 +0000 (0:00:00.668) 0:03:31.611 ********* 2026-04-07 00:51:37.325726 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.325733 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.325738 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.325745 | orchestrator | 2026-04-07 00:51:37.325751 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL rules config] ********** 2026-04-07 00:51:37.325757 | orchestrator | Tuesday 07 April 2026 00:49:47 +0000 (0:00:01.150) 0:03:32.761 ********* 2026-04-07 00:51:37.325777 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.325784 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.325790 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.325796 | orchestrator | 2026-04-07 00:51:37.325803 | orchestrator | TASK [include_role : nova] ***************************************************** 2026-04-07 00:51:37.325809 | orchestrator | Tuesday 07 April 2026 00:49:49 +0000 (0:00:01.963) 0:03:34.725 ********* 2026-04-07 00:51:37.325815 | orchestrator | included: nova for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.325821 | orchestrator | 2026-04-07 00:51:37.325828 | orchestrator | TASK [haproxy-config : Copying over nova haproxy config] *********************** 2026-04-07 00:51:37.325834 | orchestrator | Tuesday 07 April 2026 00:49:51 +0000 (0:00:01.405) 0:03:36.130 ********* 2026-04-07 00:51:37.325841 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': 
['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.325957 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.325980 | 
orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.325987 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.325994 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release//nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326001 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326088 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.326098 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release//nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326108 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326115 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-metadata', 'value': 
{'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.326122 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release//nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326184 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326193 | orchestrator | 2026-04-07 00:51:37.326199 | orchestrator | TASK [haproxy-config : Add configuration for nova when using single external frontend] *** 2026-04-07 00:51:37.326206 | orchestrator | Tuesday 07 April 2026 00:49:56 +0000 (0:00:05.138) 0:03:41.268 ********* 2026-04-07 00:51:37.326213 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.326224 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'volumes': 
['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.326231 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release//nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326238 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': 
'30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326250 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.326294 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.326306 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': 
{'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.326313 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release//nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326319 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326331 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.326403 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 
'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.326412 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release//nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 
'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.326422 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release//nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326429 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release//nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-04-07 00:51:37.326436 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.326442 | orchestrator | 2026-04-07 00:51:37.326448 | orchestrator | TASK [haproxy-config : Configuring firewall for nova] ************************** 2026-04-07 00:51:37.326455 | orchestrator | Tuesday 07 April 2026 00:49:57 +0000 (0:00:00.630) 0:03:41.899 ********* 2026-04-07 00:51:37.326472 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  
2026-04-07 00:51:37.326485 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326492 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326499 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326506 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.326529 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326536 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326543 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326549 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326556 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.326562 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326568 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326575 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326598 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.326604 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.326611 | orchestrator | 2026-04-07 00:51:37.326617 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL users config] *************** 2026-04-07 00:51:37.326623 | orchestrator | Tuesday 07 April 2026 00:49:58 +0000 (0:00:01.309) 0:03:43.209 ********* 2026-04-07 00:51:37.326630 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.326636 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.326642 | orchestrator | changed: [testbed-node-2] 2026-04-07 
00:51:37.326649 | orchestrator | 2026-04-07 00:51:37.326655 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL rules config] *************** 2026-04-07 00:51:37.326665 | orchestrator | Tuesday 07 April 2026 00:49:59 +0000 (0:00:01.318) 0:03:44.527 ********* 2026-04-07 00:51:37.326672 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.326678 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.326684 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.326690 | orchestrator | 2026-04-07 00:51:37.326697 | orchestrator | TASK [include_role : nova-cell] ************************************************ 2026-04-07 00:51:37.326704 | orchestrator | Tuesday 07 April 2026 00:50:01 +0000 (0:00:01.899) 0:03:46.427 ********* 2026-04-07 00:51:37.326710 | orchestrator | included: nova-cell for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.326716 | orchestrator | 2026-04-07 00:51:37.326722 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-novncproxy] ****************** 2026-04-07 00:51:37.326729 | orchestrator | Tuesday 07 April 2026 00:50:02 +0000 (0:00:01.143) 0:03:47.570 ********* 2026-04-07 00:51:37.326735 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-novncproxy) 2026-04-07 00:51:37.326743 | orchestrator | 2026-04-07 00:51:37.326749 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config] *** 2026-04-07 00:51:37.326755 | orchestrator | Tuesday 07 April 2026 00:50:03 +0000 (0:00:00.944) 0:03:48.514 ********* 2026-04-07 00:51:37.326762 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 
'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2026-04-07 00:51:37.326786 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2026-04-07 00:51:37.326793 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2026-04-07 00:51:37.326800 | orchestrator | 2026-04-07 00:51:37.326807 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-novncproxy when using single external frontend] *** 2026-04-07 00:51:37.326815 | orchestrator | Tuesday 07 April 2026 00:50:07 +0000 (0:00:03.532) 0:03:52.047 ********* 2026-04-07 00:51:37.326821 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.326828 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.326837 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.326847 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.326854 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.326860 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.326866 | orchestrator | 2026-04-07 00:51:37.326873 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-novncproxy] ***** 2026-04-07 00:51:37.326879 | orchestrator | Tuesday 07 April 2026 00:50:08 +0000 (0:00:01.115) 0:03:53.162 ********* 2026-04-07 00:51:37.326885 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-04-07 
00:51:37.326892 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-04-07 00:51:37.326898 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.326904 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-04-07 00:51:37.326911 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-04-07 00:51:37.326917 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.326923 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-04-07 00:51:37.326946 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-04-07 00:51:37.326953 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.326960 | orchestrator | 2026-04-07 00:51:37.326966 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2026-04-07 00:51:37.326972 | orchestrator | Tuesday 07 April 2026 00:50:09 +0000 (0:00:01.329) 0:03:54.491 ********* 2026-04-07 00:51:37.326978 | orchestrator | changed: [testbed-node-0] 2026-04-07 
00:51:37.326985 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.326992 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.326998 | orchestrator | 2026-04-07 00:51:37.327004 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2026-04-07 00:51:37.327010 | orchestrator | Tuesday 07 April 2026 00:50:11 +0000 (0:00:02.166) 0:03:56.657 ********* 2026-04-07 00:51:37.327017 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:51:37.327023 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:51:37.327029 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:51:37.327036 | orchestrator | 2026-04-07 00:51:37.327046 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-spicehtml5proxy] ************* 2026-04-07 00:51:37.327052 | orchestrator | Tuesday 07 April 2026 00:50:14 +0000 (0:00:02.714) 0:03:59.372 ********* 2026-04-07 00:51:37.327059 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-spicehtml5proxy) 2026-04-07 00:51:37.327065 | orchestrator | 2026-04-07 00:51:37.327072 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-spicehtml5proxy haproxy config] *** 2026-04-07 00:51:37.327078 | orchestrator | Tuesday 07 April 2026 00:50:15 +0000 (0:00:00.731) 0:04:00.103 ********* 2026-04-07 00:51:37.327088 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.327095 | 
orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.327101 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.327108 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.327115 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.327121 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.327127 | orchestrator | 2026-04-07 00:51:37.327132 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-spicehtml5proxy when using single external frontend] *** 2026-04-07 00:51:37.327138 | orchestrator | Tuesday 07 April 2026 00:50:16 +0000 (0:00:01.142) 0:04:01.246 ********* 2026-04-07 00:51:37.327144 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 
'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.327150 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.327176 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.327183 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.327194 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-04-07 00:51:37.327201 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.327207 | orchestrator | 2026-04-07 00:51:37.327213 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-spicehtml5proxy] *** 2026-04-07 00:51:37.327220 | orchestrator | Tuesday 07 April 2026 00:50:17 +0000 (0:00:01.283) 0:04:02.530 ********* 2026-04-07 00:51:37.327226 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.327232 | orchestrator | skipping: [testbed-node-1] 
2026-04-07 00:51:37.327238 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.327244 | orchestrator | 2026-04-07 00:51:37.327250 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2026-04-07 00:51:37.327257 | orchestrator | Tuesday 07 April 2026 00:50:19 +0000 (0:00:01.420) 0:04:03.950 ********* 2026-04-07 00:51:37.327263 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.327270 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.327276 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.327282 | orchestrator | 2026-04-07 00:51:37.327289 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2026-04-07 00:51:37.327295 | orchestrator | Tuesday 07 April 2026 00:50:21 +0000 (0:00:02.242) 0:04:06.193 ********* 2026-04-07 00:51:37.327301 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.327307 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.327313 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.327319 | orchestrator | 2026-04-07 00:51:37.327326 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-serialproxy] ***************** 2026-04-07 00:51:37.327335 | orchestrator | Tuesday 07 April 2026 00:50:24 +0000 (0:00:02.812) 0:04:09.006 ********* 2026-04-07 00:51:37.327361 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-serialproxy) 2026-04-07 00:51:37.327368 | orchestrator | 2026-04-07 00:51:37.327375 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-serialproxy haproxy config] *** 2026-04-07 00:51:37.327381 | orchestrator | Tuesday 07 April 2026 00:50:25 +0000 (0:00:00.918) 0:04:09.924 ********* 2026-04-07 00:51:37.327388 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': 
{'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-07 00:51:37.327394 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.327400 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-07 00:51:37.327407 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.327413 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-07 00:51:37.327424 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.327430 | orchestrator | 2026-04-07 00:51:37.327436 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-serialproxy when using single external frontend] *** 2026-04-07 00:51:37.327443 | orchestrator | Tuesday 07 April 
2026 00:50:25 +0000 (0:00:00.942) 0:04:10.867 ********* 2026-04-07 00:51:37.327467 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-07 00:51:37.327475 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.327481 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-07 00:51:37.327487 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.327494 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-07 00:51:37.327500 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.327507 | 
orchestrator |
2026-04-07 00:51:37.327513 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-serialproxy] ****
2026-04-07 00:51:37.327522 | orchestrator | Tuesday 07 April 2026 00:50:27 +0000 (0:00:01.166) 0:04:12.033 *********
2026-04-07 00:51:37.327529 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.327535 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.327541 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.327548 | orchestrator |
2026-04-07 00:51:37.327554 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] **********
2026-04-07 00:51:37.327560 | orchestrator | Tuesday 07 April 2026 00:50:28 +0000 (0:00:01.248) 0:04:13.281 *********
2026-04-07 00:51:37.327567 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:51:37.327573 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:51:37.327579 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:51:37.327585 | orchestrator |
2026-04-07 00:51:37.327591 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] **********
2026-04-07 00:51:37.327598 | orchestrator | Tuesday 07 April 2026 00:50:30 +0000 (0:00:02.323) 0:04:15.604 *********
2026-04-07 00:51:37.327604 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:51:37.327610 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:51:37.327616 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:51:37.327622 | orchestrator |
2026-04-07 00:51:37.327629 | orchestrator | TASK [include_role : octavia] **************************************************
2026-04-07 00:51:37.327635 | orchestrator | Tuesday 07 April 2026 00:50:33 +0000 (0:00:03.153) 0:04:18.758 *********
2026-04-07 00:51:37.327646 | orchestrator | included: octavia for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.327653 | orchestrator |
2026-04-07 00:51:37.327659 | orchestrator | TASK [haproxy-config : Copying over octavia haproxy config] ********************
2026-04-07 00:51:37.327665 | orchestrator | Tuesday 07 April 2026 00:50:35 +0000 (0:00:01.309) 0:04:20.067 *********
2026-04-07 00:51:37.327672 | orchestrator | changed: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-04-07 00:51:37.327696 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-04-07 00:51:37.327703 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327713 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327722 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.327729 | orchestrator | changed: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-04-07 00:51:37.327742 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-04-07 00:51:37.327763 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327770 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327777 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.327786 | orchestrator | changed: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-04-07 00:51:37.327800 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-04-07 00:51:37.327806 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327827 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327834 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.327841 | orchestrator |
2026-04-07 00:51:37.327848 | orchestrator | TASK [haproxy-config : Add configuration for octavia when using single external frontend] ***
2026-04-07 00:51:37.327854 | orchestrator | Tuesday 07 April 2026 00:50:38 +0000 (0:00:03.455) 0:04:23.523 *********
2026-04-07 00:51:37.327866 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-04-07 00:51:37.327876 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-04-07 00:51:37.327883 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327890 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327915 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.327922 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.327928 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-04-07 00:51:37.327938 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-04-07 00:51:37.327950 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327957 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.327963 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.327970 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.327992 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-04-07 00:51:37.327999 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-04-07 00:51:37.328008 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.328019 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-04-07 00:51:37.328026 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-04-07 00:51:37.328032 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.328038 | orchestrator |
2026-04-07 00:51:37.328045 | orchestrator | TASK [haproxy-config : Configuring firewall for octavia] ***********************
2026-04-07 00:51:37.328051 | orchestrator | Tuesday 07 April 2026 00:50:39 +0000 (0:00:00.857) 0:04:24.381 *********
2026-04-07 00:51:37.328058 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-07 00:51:37.328066 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-07 00:51:37.328072 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.328079 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-07 00:51:37.328101 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-07 00:51:37.328107 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.328114 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-07 00:51:37.328120 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-07 00:51:37.328126 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.328131 | orchestrator |
2026-04-07 00:51:37.328137 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL users config] ************
2026-04-07 00:51:37.328143 | orchestrator | Tuesday 07 April 2026 00:50:40 +0000 (0:00:00.804) 0:04:25.185 *********
2026-04-07 00:51:37.328149 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.328159 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.328166 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.328172 | orchestrator |
2026-04-07 00:51:37.328178 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL rules config] ************
2026-04-07 00:51:37.328184 | orchestrator | Tuesday 07 April 2026 00:50:41 +0000 (0:00:01.424) 0:04:26.610 *********
2026-04-07 00:51:37.328190 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.328196 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.328203 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.328209 | orchestrator |
2026-04-07 00:51:37.328215 | orchestrator | TASK [include_role : opensearch] ***********************************************
2026-04-07 00:51:37.328221 | orchestrator | Tuesday 07 April 2026 00:50:43 +0000 (0:00:02.095) 0:04:28.705 *********
2026-04-07 00:51:37.328227 | orchestrator | included: opensearch for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.328233 | orchestrator |
2026-04-07 00:51:37.328239 | orchestrator | TASK [haproxy-config : Copying over opensearch haproxy config] *****************
2026-04-07 00:51:37.328246 | orchestrator | Tuesday 07 April 2026 00:50:45 +0000 (0:00:01.417) 0:04:30.123 *********
2026-04-07 00:51:37.328255 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.328262 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.328286 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.328294 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:51:37.328309 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:51:37.328317 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:51:37.328323 | orchestrator |
2026-04-07 00:51:37.328330 | orchestrator | TASK [haproxy-config : Add configuration for opensearch when using single external frontend] ***
2026-04-07 00:51:37.328336 | orchestrator | Tuesday 07 April 2026 00:50:49 +0000 (0:00:04.396) 0:04:34.520 *********
2026-04-07 00:51:37.328373 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.328389 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:51:37.328396 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.328403 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:51:37.328410 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:51:37.328416 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.328444 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.328455 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-07 00:51:37.328461 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.328468 | orchestrator | 2026-04-07 00:51:37.328474 | orchestrator | TASK [haproxy-config : Configuring firewall for opensearch] ******************** 2026-04-07 00:51:37.328480 | orchestrator | Tuesday 07 April 2026 00:50:50 +0000 (0:00:00.544) 0:04:35.064 ********* 2026-04-07 00:51:37.328487 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': 
'9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.328494 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})  2026-04-07 00:51:37.328500 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})  2026-04-07 00:51:37.328507 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.328513 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.328520 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})  2026-04-07 00:51:37.328526 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})  2026-04-07 00:51:37.328537 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.328559 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': 
{'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.328566 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})  2026-04-07 00:51:37.328573 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})  2026-04-07 00:51:37.328579 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.328585 | orchestrator | 2026-04-07 00:51:37.328592 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL users config] ********* 2026-04-07 00:51:37.328598 | orchestrator | Tuesday 07 April 2026 00:50:51 +0000 (0:00:01.053) 0:04:36.118 ********* 2026-04-07 00:51:37.328605 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.328611 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.328617 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.328624 | orchestrator | 2026-04-07 00:51:37.328630 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL rules config] ********* 2026-04-07 00:51:37.328636 | orchestrator | Tuesday 07 April 2026 00:50:51 +0000 (0:00:00.411) 0:04:36.529 ********* 2026-04-07 00:51:37.328643 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.328649 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.328655 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.328662 | orchestrator | 2026-04-07 00:51:37.328668 | orchestrator 
| TASK [include_role : prometheus] *********************************************** 2026-04-07 00:51:37.328674 | orchestrator | Tuesday 07 April 2026 00:50:52 +0000 (0:00:01.109) 0:04:37.639 ********* 2026-04-07 00:51:37.328681 | orchestrator | included: prometheus for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.328687 | orchestrator | 2026-04-07 00:51:37.328699 | orchestrator | TASK [haproxy-config : Copying over prometheus haproxy config] ***************** 2026-04-07 00:51:37.328706 | orchestrator | Tuesday 07 April 2026 00:50:54 +0000 (0:00:01.430) 0:04:39.069 ********* 2026-04-07 00:51:37.328713 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-07 00:51:37.328724 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 00:51:37.328731 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.328753 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.328761 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 
'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-07 00:51:37.328771 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.328778 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 00:51:37.328784 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.328810 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-07 00:51:37.328818 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}}})  2026-04-07 00:51:37.328825 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 00:51:37.328834 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.328841 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.328848 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.328859 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.328881 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 
'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.328888 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release//prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-07 00:51:37.328898 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.328905 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': 
['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.328912 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.328923 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.328945 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 
'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:51:37.328956 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release//prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-07 00:51:37.328962 | orchestrator | skipping: 
[testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release//prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-07 00:51:37.328973 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.328980 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329001 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329008 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329015 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.329024 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.329031 | orchestrator | 2026-04-07 00:51:37.329037 | orchestrator | TASK [haproxy-config : Add configuration for prometheus when using single external frontend] *** 2026-04-07 00:51:37.329044 | orchestrator | Tuesday 07 April 2026 00:50:58 +0000 (0:00:03.835) 0:04:42.905 ********* 2026-04-07 00:51:37.329055 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-07 00:51:37.329062 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 00:51:37.329085 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329092 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329099 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 
'dimensions': {}}})  2026-04-07 00:51:37.329108 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.329132 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release//prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 
'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-07 00:51:37.329140 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329162 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329170 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic 
aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-07 00:51:37.329180 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.329198 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.329205 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 00:51:37.329211 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329218 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329225 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.329247 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check 
send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-07 00:51:37.329257 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.329268 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 00:51:37.329275 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release//prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-07 00:51:37.329281 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329292 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': 
['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329298 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329305 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329316 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.329323 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.329423 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.329442 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:51:37.329456 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 
'registry.osism.tech/kolla/release//prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-07 00:51:37.329463 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329470 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 00:51:37.329486 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 
'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 00:51:37.329493 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.329500 | orchestrator | 2026-04-07 00:51:37.329506 | orchestrator | TASK [haproxy-config : Configuring firewall for prometheus] ******************** 2026-04-07 00:51:37.329513 | orchestrator | Tuesday 07 April 2026 00:50:58 +0000 (0:00:00.875) 0:04:43.781 ********* 2026-04-07 00:51:37.329520 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-07 00:51:37.329527 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-07 00:51:37.329536 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.329543 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.329551 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.329560 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-07 00:51:37.329567 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-07 00:51:37.329574 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.329585 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-07 
00:51:37.329592 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.329598 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-07 00:51:37.329608 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-07 00:51:37.329615 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.329622 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-07 00:51:37.329628 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.329635 | orchestrator | 2026-04-07 00:51:37.329642 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL users config] ********* 2026-04-07 00:51:37.329648 | orchestrator | Tuesday 07 April 2026 00:51:00 +0000 (0:00:01.280) 0:04:45.062 
********* 2026-04-07 00:51:37.329655 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.329661 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.329668 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.329674 | orchestrator | 2026-04-07 00:51:37.329681 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL rules config] ********* 2026-04-07 00:51:37.329688 | orchestrator | Tuesday 07 April 2026 00:51:00 +0000 (0:00:00.446) 0:04:45.508 ********* 2026-04-07 00:51:37.329694 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.329700 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.329707 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:51:37.329713 | orchestrator | 2026-04-07 00:51:37.329720 | orchestrator | TASK [include_role : rabbitmq] ************************************************* 2026-04-07 00:51:37.329727 | orchestrator | Tuesday 07 April 2026 00:51:01 +0000 (0:00:01.286) 0:04:46.794 ********* 2026-04-07 00:51:37.329733 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:51:37.329740 | orchestrator | 2026-04-07 00:51:37.329746 | orchestrator | TASK [haproxy-config : Copying over rabbitmq haproxy config] ******************* 2026-04-07 00:51:37.329752 | orchestrator | Tuesday 07 April 2026 00:51:03 +0000 (0:00:01.397) 0:04:48.192 ********* 2026-04-07 00:51:37.329763 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': 
['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:51:37.329775 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:51:37.329785 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 
'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-07 00:51:37.329792 | orchestrator | 2026-04-07 00:51:37.329799 | orchestrator | TASK [haproxy-config : Add configuration for rabbitmq when using single external frontend] *** 2026-04-07 00:51:37.329805 | orchestrator | Tuesday 07 April 2026 00:51:06 +0000 (0:00:02.861) 0:04:51.054 ********* 2026-04-07 00:51:37.329812 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:51:37.329819 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:51:37.329829 | orchestrator | skipping: [testbed-node-1] => 
(item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-07 00:51:37.329841 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:51:37.329848 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release//rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 
'port': '15672', 'host_group': 'rabbitmq'}}}})
2026-04-07 00:51:37.329858 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.329865 | orchestrator |
2026-04-07 00:51:37.329871 | orchestrator | TASK [haproxy-config : Configuring firewall for rabbitmq] **********************
2026-04-07 00:51:37.329878 | orchestrator | Tuesday 07 April 2026 00:51:06 +0000 (0:00:00.419) 0:04:51.474 *********
2026-04-07 00:51:37.329884 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})
2026-04-07 00:51:37.329891 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.329897 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})
2026-04-07 00:51:37.329903 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.329909 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})
2026-04-07 00:51:37.329915 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.329921 | orchestrator |
2026-04-07 00:51:37.329927 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL users config] ***********
2026-04-07 00:51:37.329932 | orchestrator | Tuesday 07 April 2026 00:51:07 +0000 (0:00:00.422) 0:04:52.081 *********
2026-04-07 00:51:37.329938 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.329944 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.329950 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.329956 | orchestrator |
2026-04-07 00:51:37.329961 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL rules config] ***********
2026-04-07 00:51:37.329967 | orchestrator | Tuesday 07 April 2026 00:51:07 +0000 (0:00:00.422) 0:04:52.503 *********
2026-04-07 00:51:37.329973 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.329979 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.329984 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.329991 | orchestrator |
2026-04-07 00:51:37.329997 | orchestrator | TASK [include_role : skyline] **************************************************
2026-04-07 00:51:37.330010 | orchestrator | Tuesday 07 April 2026 00:51:09 +0000 (0:00:01.408) 0:04:53.912 *********
2026-04-07 00:51:37.330063 | orchestrator | included: skyline for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.330070 | orchestrator |
2026-04-07 00:51:37.330076 | orchestrator | TASK [haproxy-config : Copying over skyline haproxy config] ********************
2026-04-07 00:51:37.330082 | orchestrator | Tuesday 07 April 2026 00:51:10 +0000 (0:00:01.741) 0:04:55.653 *********
2026-04-07 00:51:37.330093 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})
2026-04-07 00:51:37.330101 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})
2026-04-07 00:51:37.330113 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})
2026-04-07 00:51:37.330120 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})
2026-04-07 00:51:37.330135 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})
2026-04-07 00:51:37.330142 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})
2026-04-07 00:51:37.330149 | orchestrator |
2026-04-07 00:51:37.330155 | orchestrator | TASK [haproxy-config : Add configuration for skyline when using single external frontend] ***
2026-04-07 00:51:37.330161 | orchestrator | Tuesday 07 April 2026 00:51:17 +0000 (0:00:06.231) 0:05:01.885 *********
2026-04-07 00:51:37.330171 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-apiserver:6.0.1.20260328', 'volumes':
['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})
2026-04-07 00:51:37.330183 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})
2026-04-07 00:51:37.330190 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.330201 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})
2026-04-07 00:51:37.330224 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})
2026-04-07 00:51:37.330231 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.330238 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})
2026-04-07 00:51:37.330249 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999',
'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})
2026-04-07 00:51:37.330256 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.330262 | orchestrator |
2026-04-07 00:51:37.330268 | orchestrator | TASK [haproxy-config : Configuring firewall for skyline] ***********************
2026-04-07 00:51:37.330275 | orchestrator | Tuesday 07 April 2026 00:51:18 +0000 (0:00:01.023) 0:05:02.908 *********
2026-04-07 00:51:37.330285 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})
2026-04-07 00:51:37.330292 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})
2026-04-07 00:51:37.330299 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})
2026-04-07 00:51:37.330306 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})
2026-04-07 00:51:37.330312 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.330318 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})
2026-04-07 00:51:37.330327 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})
2026-04-07 00:51:37.330334 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})
2026-04-07 00:51:37.330340 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})
2026-04-07 00:51:37.330373 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.330379 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})
2026-04-07 00:51:37.330385 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})
2026-04-07 00:51:37.330392 | orchestrator | skipping: [testbed-node-2] => (item={'key':
'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})
2026-04-07 00:51:37.330398 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})
2026-04-07 00:51:37.330404 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.330411 | orchestrator |
2026-04-07 00:51:37.330418 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL users config] ************
2026-04-07 00:51:37.330424 | orchestrator | Tuesday 07 April 2026 00:51:19 +0000 (0:00:01.280) 0:05:04.188 *********
2026-04-07 00:51:37.330430 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.330436 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.330442 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.330449 | orchestrator |
2026-04-07 00:51:37.330455 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL rules config] ************
2026-04-07 00:51:37.330461 | orchestrator | Tuesday 07 April 2026 00:51:20 +0000 (0:00:01.167) 0:05:05.356 *********
2026-04-07 00:51:37.330467 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:51:37.330474 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:51:37.330480 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:51:37.330486 | orchestrator |
2026-04-07 00:51:37.330492 | orchestrator | TASK [include_role : tacker] ***************************************************
2026-04-07 00:51:37.330503 | orchestrator | Tuesday 07 April 2026 00:51:22 +0000 (0:00:02.081) 0:05:07.437 *********
2026-04-07 00:51:37.330509 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.330515 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.330521 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.330528 | orchestrator |
2026-04-07 00:51:37.330534 | orchestrator | TASK [include_role : trove] ****************************************************
2026-04-07 00:51:37.330540 | orchestrator | Tuesday 07 April 2026 00:51:22 +0000 (0:00:00.341) 0:05:07.778 *********
2026-04-07 00:51:37.330546 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.330552 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.330558 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.330565 | orchestrator |
2026-04-07 00:51:37.330571 | orchestrator | TASK [include_role : venus] ****************************************************
2026-04-07 00:51:37.330577 | orchestrator | Tuesday 07 April 2026 00:51:23 +0000 (0:00:00.559) 0:05:08.338 *********
2026-04-07 00:51:37.330583 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.330590 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.330596 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.330602 | orchestrator |
2026-04-07 00:51:37.330608 | orchestrator | TASK [include_role : watcher] **************************************************
2026-04-07 00:51:37.330614 | orchestrator | Tuesday 07 April 2026 00:51:23 +0000 (0:00:00.300) 0:05:08.639 *********
2026-04-07 00:51:37.330621 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.330631 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.330637 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.330643 | orchestrator |
2026-04-07 00:51:37.330649 | orchestrator | TASK [include_role : zun] ******************************************************
2026-04-07 00:51:37.330656 | orchestrator | Tuesday 07 April 2026 00:51:24 +0000 (0:00:00.309) 0:05:08.948 *********
2026-04-07 00:51:37.330662 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.330668 | orchestrator | skipping:
[testbed-node-1]
2026-04-07 00:51:37.330675 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.330681 | orchestrator |
2026-04-07 00:51:37.330687 | orchestrator | TASK [include_role : loadbalancer] *********************************************
2026-04-07 00:51:37.330693 | orchestrator | Tuesday 07 April 2026 00:51:24 +0000 (0:00:00.290) 0:05:09.238 *********
2026-04-07 00:51:37.330700 | orchestrator | included: loadbalancer for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:51:37.330706 | orchestrator |
2026-04-07 00:51:37.330712 | orchestrator | TASK [service-check-containers : loadbalancer | Check containers] **************
2026-04-07 00:51:37.330718 | orchestrator | Tuesday 07 April 2026 00:51:25 +0000 (0:00:01.562) 0:05:10.801 *********
2026-04-07 00:51:37.330728 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2026-04-07 00:51:37.330736 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2026-04-07 00:51:37.330743 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2026-04-07 00:51:37.330752 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-07 00:51:37.330760 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-07 00:51:37.330771 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-07 00:51:37.330780 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-07 00:51:37.330787 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'],
'dimensions': {}}})
2026-04-07 00:51:37.330793 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-07 00:51:37.330799 | orchestrator |
2026-04-07 00:51:37.330806 | orchestrator | TASK [service-check-containers : loadbalancer | Notify handlers to restart containers] ***
2026-04-07 00:51:37.330812 | orchestrator | Tuesday 07 April 2026 00:51:27 +0000 (0:00:02.057) 0:05:12.858 *********
2026-04-07 00:51:37.330818 | orchestrator | changed: [testbed-node-0] => {
2026-04-07 00:51:37.330826 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 00:51:37.330832 | orchestrator | }
2026-04-07 00:51:37.330838 | orchestrator | changed: [testbed-node-1] => {
2026-04-07 00:51:37.330845 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 00:51:37.330851 | orchestrator | }
2026-04-07 00:51:37.330857 | orchestrator | changed: [testbed-node-2] => {
2026-04-07 00:51:37.330864 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 00:51:37.330870 | orchestrator | }
2026-04-07 00:51:37.330876 | orchestrator |
2026-04-07 00:51:37.330883 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-04-07 00:51:37.330889 | orchestrator | Tuesday 07 April 2026 00:51:28 +0000 (0:00:00.324) 0:05:13.183 *********
2026-04-07 00:51:37.330899 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2026-04-07 00:51:37.330910 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-07 00:51:37.330917 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-07 00:51:37.330923 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:51:37.330933 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2026-04-07 00:51:37.330939 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-07 00:51:37.330946 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-07 00:51:37.330953 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:51:37.330960 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2026-04-07 00:51:37.330974 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-07 00:51:37.330981 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-07 00:51:37.330987 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:51:37.330993 | orchestrator |
2026-04-07 00:51:37.330999 | orchestrator | RUNNING HANDLER [loadbalancer : Check IP addresses on the API interface] *******
2026-04-07 00:51:37.331006 | orchestrator | Tuesday 07 April 2026 00:51:29 +0000 (0:00:01.609) 0:05:14.792 ********* 2026-04-07 00:51:37.331012 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.331019 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.331025 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.331031 | orchestrator | 2026-04-07 00:51:37.331038 | orchestrator | RUNNING HANDLER [loadbalancer : Group HA nodes by status] ********************** 2026-04-07 00:51:37.331044 | orchestrator | Tuesday 07 April 2026 00:51:30 +0000 (0:00:00.670) 0:05:15.462 ********* 2026-04-07 00:51:37.331050 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.331056 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.331062 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.331069 | orchestrator | 2026-04-07 00:51:37.331077 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup keepalived container] ************** 2026-04-07 00:51:37.331084 | orchestrator | Tuesday 07 April 2026 00:51:31 +0000 (0:00:00.708) 0:05:16.171 ********* 2026-04-07 00:51:37.331090 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.331096 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.331103 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.331109 | orchestrator | 2026-04-07 00:51:37.331115 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup haproxy container] ***************** 2026-04-07 00:51:37.331121 | orchestrator | Tuesday 07 April 2026 00:51:32 +0000 (0:00:00.856) 0:05:17.027 ********* 2026-04-07 00:51:37.331127 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.331132 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.331138 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.331144 | orchestrator | 2026-04-07 00:51:37.331151 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup proxysql container] **************** 2026-04-07 00:51:37.331157 | orchestrator | Tuesday 07 April 2026 00:51:32 
+0000 (0:00:00.828) 0:05:17.856 ********* 2026-04-07 00:51:37.331163 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:51:37.331169 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:51:37.331175 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:51:37.331181 | orchestrator | 2026-04-07 00:51:37.331188 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup haproxy container] **************** 2026-04-07 00:51:37.331195 | orchestrator | Tuesday 07 April 2026 00:51:33 +0000 (0:00:00.870) 0:05:18.726 ********* 2026-04-07 00:51:37.331210 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fhaproxy\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_i0i6zisy/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_i0i6zisy/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_i0i6zisy/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File 
\"/tmp/ansible_kolla_container_payload_i0i6zisy/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fhaproxy: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:51:37.331222 | orchestrator | fatal: [testbed-node-1]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fhaproxy\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_rn3abnvk/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_rn3abnvk/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_rn3abnvk/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_rn3abnvk/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, 
response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fhaproxy: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:51:37.331238 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fhaproxy\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_v3ofr__z/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_v3ofr__z/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_v3ofr__z/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_v3ofr__z/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n 
self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fhaproxy: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:51:37.331245 | orchestrator | 2026-04-07 00:51:37.331255 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:51:37.331262 | orchestrator | testbed-node-0 : ok=120  changed=76  unreachable=0 failed=1  skipped=88  rescued=0 ignored=0 2026-04-07 00:51:37.331269 | orchestrator | testbed-node-1 : ok=119  changed=76  unreachable=0 failed=1  skipped=88  rescued=0 ignored=0 2026-04-07 00:51:37.331275 | orchestrator | testbed-node-2 : ok=119  changed=76  unreachable=0 failed=1  skipped=88  rescued=0 ignored=0 2026-04-07 00:51:37.331282 | orchestrator | 2026-04-07 00:51:37.331288 | orchestrator | 2026-04-07 00:51:37.331298 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:51:37.331304 | orchestrator | Tuesday 07 April 2026 00:51:36 +0000 (0:00:02.721) 0:05:21.448 ********* 2026-04-07 00:51:37.331311 | orchestrator | =============================================================================== 2026-04-07 00:51:37.331317 | orchestrator | haproxy-config : Copying over skyline haproxy config -------------------- 6.23s 2026-04-07 00:51:37.331323 | orchestrator | sysctl : Setting sysctl values ------------------------------------------ 5.91s 2026-04-07 00:51:37.331329 | orchestrator | haproxy-config : Copying over nova haproxy 
config ----------------------- 5.14s 2026-04-07 00:51:37.331335 | orchestrator | haproxy-config : Copying over glance haproxy config --------------------- 4.87s 2026-04-07 00:51:37.331358 | orchestrator | haproxy-config : Copying over opensearch haproxy config ----------------- 4.40s 2026-04-07 00:51:37.331366 | orchestrator | haproxy-config : Copying over designate haproxy config ------------------ 4.37s 2026-04-07 00:51:37.331372 | orchestrator | haproxy-config : Copying over barbican haproxy config ------------------- 4.20s 2026-04-07 00:51:37.331378 | orchestrator | haproxy-config : Copying over magnum haproxy config --------------------- 4.06s 2026-04-07 00:51:37.331384 | orchestrator | haproxy-config : Copying over neutron haproxy config -------------------- 3.94s 2026-04-07 00:51:37.331391 | orchestrator | haproxy-config : Copying over prometheus haproxy config ----------------- 3.84s 2026-04-07 00:51:37.331397 | orchestrator | service-cert-copy : mariadb | Copying over extra CA certificates -------- 3.66s 2026-04-07 00:51:37.331403 | orchestrator | haproxy-config : Copying over manila haproxy config --------------------- 3.65s 2026-04-07 00:51:37.331409 | orchestrator | haproxy-config : Copying over keystone haproxy config ------------------- 3.64s 2026-04-07 00:51:37.331415 | orchestrator | loadbalancer : Copying checks for services which are enabled ------------ 3.58s 2026-04-07 00:51:37.331422 | orchestrator | haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config --- 3.53s 2026-04-07 00:51:37.331428 | orchestrator | haproxy-config : Copying over octavia haproxy config -------------------- 3.46s 2026-04-07 00:51:37.331435 | orchestrator | haproxy-config : Copying over horizon haproxy config -------------------- 3.43s 2026-04-07 00:51:37.331441 | orchestrator | loadbalancer : Copying over proxysql config ----------------------------- 3.38s 2026-04-07 00:51:37.331447 | orchestrator | haproxy-config : Configuring firewall for glance 
------------------------ 3.36s 2026-04-07 00:51:37.331454 | orchestrator | loadbalancer : Copying over config.json files for services -------------- 3.30s 2026-04-07 00:51:37.331463 | orchestrator | 2026-04-07 00:51:37 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:51:37.331470 | orchestrator | 2026-04-07 00:51:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:51:40.358849 | orchestrator | 2026-04-07 00:51:40 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:51:40.360311 | orchestrator | 2026-04-07 00:51:40 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state STARTED 2026-04-07 00:51:40.361839 | orchestrator | 2026-04-07 00:51:40 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:51:40.361896 | orchestrator | 2026-04-07 00:51:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:51:43.397867 | orchestrator | 2026-04-07 00:51:43 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:51:43.400058 | orchestrator | 2026-04-07 00:51:43 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state STARTED 2026-04-07 00:51:43.400748 | orchestrator | 2026-04-07 00:51:43 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:51:43.400772 | orchestrator | 2026-04-07 00:51:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:51:46.426930 | orchestrator | 2026-04-07 00:51:46 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:51:46.427058 | orchestrator | 2026-04-07 00:51:46 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state STARTED 2026-04-07 00:51:46.428194 | orchestrator | 2026-04-07 00:51:46 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:51:46.428225 | orchestrator | 2026-04-07 00:51:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:51:49.458166 | 
orchestrator | 2026-04-07 00:51:49 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:51:49.458361 | orchestrator | 2026-04-07 00:51:49 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state STARTED 2026-04-07 00:51:49.459873 | orchestrator | 2026-04-07 00:51:49 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:51:49.459900 | orchestrator | 2026-04-07 00:51:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:51:52.485883 | orchestrator | 2026-04-07 00:51:52 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:51:52.486637 | orchestrator | 2026-04-07 00:51:52 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state STARTED 2026-04-07 00:51:52.487835 | orchestrator | 2026-04-07 00:51:52 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:51:52.487929 | orchestrator | 2026-04-07 00:51:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:51:55.521004 | orchestrator | 2026-04-07 00:51:55 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:51:55.521501 | orchestrator | 2026-04-07 00:51:55 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state STARTED 2026-04-07 00:51:55.521965 | orchestrator | 2026-04-07 00:51:55 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:51:55.522009 | orchestrator | 2026-04-07 00:51:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:51:58.546354 | orchestrator | 2026-04-07 00:51:58 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:51:58.547985 | orchestrator | 2026-04-07 00:51:58 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state STARTED 2026-04-07 00:51:58.548642 | orchestrator | 2026-04-07 00:51:58 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:51:58.548677 | orchestrator | 2026-04-07 
00:51:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:01.577577 | orchestrator | 2026-04-07 00:52:01 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:01.580082 | orchestrator | 2026-04-07 00:52:01 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state STARTED 2026-04-07 00:52:01.582239 | orchestrator | 2026-04-07 00:52:01 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:01.582289 | orchestrator | 2026-04-07 00:52:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:04.622174 | orchestrator | 2026-04-07 00:52:04 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:04.626040 | orchestrator | 2026-04-07 00:52:04 | INFO  | Task 69b43c7f-a84d-4fd4-bc1a-5aaf781ad0ac is in state SUCCESS 2026-04-07 00:52:04.628559 | orchestrator | 2026-04-07 00:52:04.628607 | orchestrator | 2026-04-07 00:52:04.628615 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:52:04.628622 | orchestrator | 2026-04-07 00:52:04.628629 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 00:52:04.628636 | orchestrator | Tuesday 07 April 2026 00:51:40 +0000 (0:00:00.326) 0:00:00.326 ********* 2026-04-07 00:52:04.628643 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:52:04.628651 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:52:04.628675 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:52:04.628681 | orchestrator | 2026-04-07 00:52:04.628688 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:52:04.628695 | orchestrator | Tuesday 07 April 2026 00:51:40 +0000 (0:00:00.277) 0:00:00.604 ********* 2026-04-07 00:52:04.628702 | orchestrator | ok: [testbed-node-0] => (item=enable_opensearch_True) 2026-04-07 00:52:04.628709 | orchestrator | ok: [testbed-node-1] => 
(item=enable_opensearch_True) 2026-04-07 00:52:04.628716 | orchestrator | ok: [testbed-node-2] => (item=enable_opensearch_True) 2026-04-07 00:52:04.628722 | orchestrator | 2026-04-07 00:52:04.628728 | orchestrator | PLAY [Apply role opensearch] *************************************************** 2026-04-07 00:52:04.628735 | orchestrator | 2026-04-07 00:52:04.628741 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2026-04-07 00:52:04.628748 | orchestrator | Tuesday 07 April 2026 00:51:40 +0000 (0:00:00.265) 0:00:00.870 ********* 2026-04-07 00:52:04.628755 | orchestrator | included: /ansible/roles/opensearch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:52:04.628762 | orchestrator | 2026-04-07 00:52:04.628768 | orchestrator | TASK [opensearch : Setting sysctl values] ************************************** 2026-04-07 00:52:04.628797 | orchestrator | Tuesday 07 April 2026 00:51:41 +0000 (0:00:00.464) 0:00:01.335 ********* 2026-04-07 00:52:04.628805 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-07 00:52:04.628812 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-07 00:52:04.628818 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-07 00:52:04.628825 | orchestrator | 2026-04-07 00:52:04.628831 | orchestrator | TASK [opensearch : Ensuring config directories exist] ************************** 2026-04-07 00:52:04.628846 | orchestrator | Tuesday 07 April 2026 00:51:42 +0000 (0:00:00.986) 0:00:02.321 ********* 2026-04-07 00:52:04.628904 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g 
-Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.628916 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.628931 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.628947 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629071 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 
'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629081 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629091 | orchestrator | 2026-04-07 00:52:04.629098 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2026-04-07 00:52:04.629105 | orchestrator | Tuesday 07 April 2026 00:51:43 +0000 (0:00:01.167) 0:00:03.488 ********* 2026-04-07 00:52:04.629111 | orchestrator | included: /ansible/roles/opensearch/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:52:04.629117 | orchestrator | 2026-04-07 00:52:04.629131 | orchestrator | TASK [service-cert-copy : opensearch | Copying over extra CA certificates] ***** 2026-04-07 00:52:04.629137 | orchestrator | Tuesday 07 April 2026 00:51:44 +0000 (0:00:00.431) 0:00:03.920 ********* 2026-04-07 00:52:04.629143 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629153 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629159 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629166 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 
'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629223 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:52:04.629236 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:52:04.629242 | orchestrator |
2026-04-07 00:52:04.629248 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS certificate] ***
2026-04-07 00:52:04.629254 | orchestrator | Tuesday 07 April 2026 00:51:46 +0000 (0:00:02.332) 0:00:06.252 *********
2026-04-07 00:52:04.629259 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS':
'-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:52:04.629276 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:52:04.629282 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-07 00:52:04.629291 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:52:04.629297 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:04.629304 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 
'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-07 00:52:04.629313 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:04.629324 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option 
httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:52:04.629331 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:52:04.629337 | orchestrator |
2026-04-07 00:52:04.629344 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS key] ***
2026-04-07 00:52:04.629349 | orchestrator | Tuesday 07 April 2026 00:51:47 +0000 (0:00:00.762) 0:00:07.015 *********
2026-04-07 00:52:04.629357 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:52:04.629364 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes':
['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:52:04.629370 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-07 00:52:04.629380 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:04.629390 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': 
{'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-07 00:52:04.629397 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:04.629406 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': 
'9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 00:52:04.629412 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:52:04.629424 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:52:04.629443 | orchestrator |
2026-04-07 00:52:04.629451 | orchestrator | TASK [opensearch : Copying over config.json files for services] ****************
2026-04-07 00:52:04.629457 | orchestrator | Tuesday 07 April 2026 00:51:48 +0000 (0:00:00.966) 0:00:07.981 *********
2026-04-07 00:52:04.629467 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g
-Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629474 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629483 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629490 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629505 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 
'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629512 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-07 00:52:04.629519 | orchestrator |
2026-04-07 00:52:04.629525 | orchestrator | TASK [opensearch : Copying over opensearch service config file] ****************
2026-04-07 00:52:04.629532 | orchestrator | Tuesday 07 April 2026 00:51:50 +0000 (0:00:02.265) 0:00:10.246 *********
2026-04-07 00:52:04.629539 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:52:04.629545 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:52:04.629551 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:52:04.629557 | orchestrator |
2026-04-07 00:52:04.629566 | orchestrator | TASK [opensearch : Copying over opensearch-dashboards config file] *************
2026-04-07 00:52:04.629573 | orchestrator | Tuesday 07 April 2026 00:51:52 +0000 (0:00:02.479) 0:00:12.725 *********
2026-04-07 00:52:04.629578 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:52:04.629585 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:52:04.629591 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:52:04.629598 | orchestrator |
2026-04-07 00:52:04.629604 | orchestrator | TASK [service-check-containers : opensearch | Check containers] ****************
2026-04-07 00:52:04.629614 | orchestrator | Tuesday 07 April 2026 00:51:54 +0000 (0:00:01.577) 0:00:14.303 *********
2026-04-07 00:52:04.629621 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data',
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629628 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629639 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 00:52:04.629649 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629661 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629671 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-07 00:52:04.629678 | orchestrator | 2026-04-07 00:52:04.629685 
| orchestrator | TASK [service-check-containers : opensearch | Notify handlers to restart containers] *** 2026-04-07 00:52:04.629692 | orchestrator | Tuesday 07 April 2026 00:51:56 +0000 (0:00:02.039) 0:00:16.342 ********* 2026-04-07 00:52:04.629698 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:52:04.629704 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:52:04.629711 | orchestrator | } 2026-04-07 00:52:04.629718 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 00:52:04.629724 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:52:04.629730 | orchestrator | } 2026-04-07 00:52:04.629736 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:52:04.629743 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:52:04.629749 | orchestrator | } 2026-04-07 00:52:04.629755 | orchestrator | 2026-04-07 00:52:04.629762 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:52:04.629767 | orchestrator | Tuesday 07 April 2026 00:51:56 +0000 (0:00:00.390) 0:00:16.733 ********* 2026-04-07 00:52:04.629774 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option 
httpchk']}}}})  2026-04-07 00:52:04.629788 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-07 00:52:04.629795 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:04.629802 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:52:04.629813 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-07 00:52:04.629820 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:04.629830 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 00:52:04.629840 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release//opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-07 00:52:04.629847 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:04.629853 | orchestrator | 2026-04-07 00:52:04.629859 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2026-04-07 00:52:04.629865 | orchestrator | Tuesday 07 April 
2026 00:51:57 +0000 (0:00:00.699) 0:00:17.432 ********* 2026-04-07 00:52:04.629872 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:04.629878 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:04.629884 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:04.629890 | orchestrator | 2026-04-07 00:52:04.629897 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2026-04-07 00:52:04.629903 | orchestrator | Tuesday 07 April 2026 00:51:57 +0000 (0:00:00.248) 0:00:17.681 ********* 2026-04-07 00:52:04.629909 | orchestrator | 2026-04-07 00:52:04.629916 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2026-04-07 00:52:04.629922 | orchestrator | Tuesday 07 April 2026 00:51:57 +0000 (0:00:00.061) 0:00:17.743 ********* 2026-04-07 00:52:04.629930 | orchestrator | 2026-04-07 00:52:04.629936 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2026-04-07 00:52:04.629943 | orchestrator | Tuesday 07 April 2026 00:51:57 +0000 (0:00:00.057) 0:00:17.801 ********* 2026-04-07 00:52:04.629949 | orchestrator | 2026-04-07 00:52:04.629955 | orchestrator | RUNNING HANDLER [opensearch : Disable shard allocation] ************************ 2026-04-07 00:52:04.629961 | orchestrator | Tuesday 07 April 2026 00:51:57 +0000 (0:00:00.058) 0:00:17.860 ********* 2026-04-07 00:52:04.629967 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:04.629973 | orchestrator | 2026-04-07 00:52:04.629980 | orchestrator | RUNNING HANDLER [opensearch : Perform a flush] ********************************* 2026-04-07 00:52:04.629990 | orchestrator | Tuesday 07 April 2026 00:51:58 +0000 (0:00:00.435) 0:00:18.295 ********* 2026-04-07 00:52:04.629997 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:04.630004 | orchestrator | 2026-04-07 00:52:04.630040 | orchestrator | RUNNING HANDLER [opensearch : Restart opensearch container] 
******************** 2026-04-07 00:52:04.630049 | orchestrator | Tuesday 07 April 2026 00:51:58 +0000 (0:00:00.179) 0:00:18.475 ********* 2026-04-07 00:52:04.630066 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopensearch\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_bfv2_fpd/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_bfv2_fpd/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_bfv2_fpd/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_bfv2_fpd/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise 
create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopensearch: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:52:04.630082 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopensearch\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_3j96qep3/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_3j96qep3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_3j96qep3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File 
\"/tmp/ansible_kolla_container_payload_3j96qep3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopensearch: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:52:04.630098 | orchestrator | fatal: [testbed-node-2]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopensearch\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_oqazgp4b/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_oqazgp4b/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_oqazgp4b/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_oqazgp4b/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, 
response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fopensearch: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:52:04.630107 | orchestrator | 2026-04-07 00:52:04.630114 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:52:04.630121 | orchestrator | testbed-node-0 : ok=12  changed=8  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-07 00:52:04.630129 | orchestrator | testbed-node-1 : ok=12  changed=8  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-07 00:52:04.630136 | orchestrator | testbed-node-2 : ok=12  changed=8  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-07 00:52:04.630143 | orchestrator | 2026-04-07 00:52:04.630150 | orchestrator | 2026-04-07 00:52:04.630166 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:52:04.630173 | orchestrator | Tuesday 07 April 2026 00:52:01 +0000 (0:00:03.306) 0:00:21.781 ********* 2026-04-07 00:52:04.630180 | orchestrator | =============================================================================== 2026-04-07 00:52:04.630186 | orchestrator | opensearch : Restart opensearch container ------------------------------- 3.31s 2026-04-07 00:52:04.630193 | orchestrator | opensearch : Copying over opensearch service config file ---------------- 2.48s 2026-04-07 00:52:04.630200 | orchestrator | service-cert-copy : opensearch | Copying over extra CA certificates ----- 2.33s 2026-04-07 00:52:04.630207 | orchestrator | opensearch : Copying over config.json files for services ---------------- 2.27s 2026-04-07 00:52:04.630214 | orchestrator | service-check-containers : opensearch | Check containers ---------------- 2.04s 2026-04-07 00:52:04.630221 | orchestrator | opensearch : Copying over 
opensearch-dashboards config file ------------- 1.58s 2026-04-07 00:52:04.630228 | orchestrator | opensearch : Ensuring config directories exist -------------------------- 1.17s 2026-04-07 00:52:04.630234 | orchestrator | opensearch : Setting sysctl values -------------------------------------- 0.99s 2026-04-07 00:52:04.630241 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS key --- 0.97s 2026-04-07 00:52:04.630247 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS certificate --- 0.76s 2026-04-07 00:52:04.630254 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.70s 2026-04-07 00:52:04.630261 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.46s 2026-04-07 00:52:04.630267 | orchestrator | opensearch : Disable shard allocation ----------------------------------- 0.44s 2026-04-07 00:52:04.630274 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.43s 2026-04-07 00:52:04.630280 | orchestrator | service-check-containers : opensearch | Notify handlers to restart containers --- 0.39s 2026-04-07 00:52:04.630287 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.28s 2026-04-07 00:52:04.630293 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.27s 2026-04-07 00:52:04.630300 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.25s 2026-04-07 00:52:04.630310 | orchestrator | opensearch : Perform a flush -------------------------------------------- 0.18s 2026-04-07 00:52:04.630317 | orchestrator | opensearch : Flush handlers --------------------------------------------- 0.18s 2026-04-07 00:52:04.630323 | orchestrator | 2026-04-07 00:52:04 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:04.630330 | orchestrator | 2026-04-07 
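The root cause of the three `FAILED!` results above is visible in the pull URL: the image name `registry.osism.tech/kolla/release//opensearch` contains a doubled slash, i.e. an empty path component (typically the result of an unset namespace variable concatenated between two `/`). Docker's reference grammar rejects such names, hence `400 Bad Request ("invalid reference format")`. A sketch of that check, assuming a simplified approximation of the distribution/reference repository grammar (tags and digests omitted):

```python
import re

# Simplified approximation of Docker's repository-name grammar; the real
# grammar (distribution/reference) is more detailed, but both reject an
# empty path component.
_COMPONENT = r"[a-z0-9]+(?:(?:[._]|__|[-]+)[a-z0-9]+)*"
_NAME = re.compile(
    rf"^(?:[a-zA-Z0-9.\-]+(?::[0-9]+)?/)?{_COMPONENT}(?:/{_COMPONENT})*$"
)

def is_valid_image_name(name: str) -> bool:
    """Return True if the repository part of an image reference parses."""
    return bool(_NAME.match(name))

# The failing reference from the log: '//' yields an empty path component.
assert not is_valid_image_name("registry.osism.tech/kolla/release//opensearch")
# Collapsing the duplicate slash yields a well-formed reference.
assert is_valid_image_name("registry.osism.tech/kolla/release/opensearch")
```

The same malformed prefix appears on every kolla image in this run (`opensearch`, `opensearch-dashboards`, `mariadb-server`), so the fix belongs in the variable that composes the registry/namespace prefix, not in any one service.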
00:52:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:07.670599 | orchestrator | 2026-04-07 00:52:07 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:07.671106 | orchestrator | 2026-04-07 00:52:07 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:07.671886 | orchestrator | 2026-04-07 00:52:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:10.723298 | orchestrator | 2026-04-07 00:52:10 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:10.725016 | orchestrator | 2026-04-07 00:52:10 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:10.725132 | orchestrator | 2026-04-07 00:52:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:13.774268 | orchestrator | 2026-04-07 00:52:13 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:13.774895 | orchestrator | 2026-04-07 00:52:13 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:13.774924 | orchestrator | 2026-04-07 00:52:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:16.815001 | orchestrator | 2026-04-07 00:52:16 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:16.816962 | orchestrator | 2026-04-07 00:52:16 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:16.817030 | orchestrator | 2026-04-07 00:52:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:19.860913 | orchestrator | 2026-04-07 00:52:19 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:19.861256 | orchestrator | 2026-04-07 00:52:19 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:19.861302 | orchestrator | 2026-04-07 00:52:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 
00:52:22.901408 | orchestrator | 2026-04-07 00:52:22 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:22.903031 | orchestrator | 2026-04-07 00:52:22 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:22.903090 | orchestrator | 2026-04-07 00:52:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:25.947646 | orchestrator | 2026-04-07 00:52:25 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:25.949819 | orchestrator | 2026-04-07 00:52:25 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:25.949871 | orchestrator | 2026-04-07 00:52:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:28.989806 | orchestrator | 2026-04-07 00:52:28 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:28.992051 | orchestrator | 2026-04-07 00:52:28 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:28.992130 | orchestrator | 2026-04-07 00:52:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:32.038677 | orchestrator | 2026-04-07 00:52:32 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:32.040292 | orchestrator | 2026-04-07 00:52:32 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:32.040371 | orchestrator | 2026-04-07 00:52:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:35.085430 | orchestrator | 2026-04-07 00:52:35 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:35.088474 | orchestrator | 2026-04-07 00:52:35 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:35.088603 | orchestrator | 2026-04-07 00:52:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:38.133301 | orchestrator | 2026-04-07 00:52:38 | INFO  | Task 
e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:38.134276 | orchestrator | 2026-04-07 00:52:38 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:38.134316 | orchestrator | 2026-04-07 00:52:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:41.172743 | orchestrator | 2026-04-07 00:52:41 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:41.174132 | orchestrator | 2026-04-07 00:52:41 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:41.174170 | orchestrator | 2026-04-07 00:52:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:44.221200 | orchestrator | 2026-04-07 00:52:44 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:44.223073 | orchestrator | 2026-04-07 00:52:44 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:44.223150 | orchestrator | 2026-04-07 00:52:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:47.270009 | orchestrator | 2026-04-07 00:52:47 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state STARTED 2026-04-07 00:52:47.272548 | orchestrator | 2026-04-07 00:52:47 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:47.272918 | orchestrator | 2026-04-07 00:52:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:50.305115 | orchestrator | 2026-04-07 00:52:50 | INFO  | Task e38c6420-ea9c-4a80-b014-caa1ca3517f2 is in state SUCCESS 2026-04-07 00:52:50.305767 | orchestrator | 2026-04-07 00:52:50.305799 | orchestrator | 2026-04-07 00:52:50.305809 | orchestrator | PLAY [Set kolla_action_mariadb] ************************************************ 2026-04-07 00:52:50.305858 | orchestrator | 2026-04-07 00:52:50.305869 | orchestrator | TASK [Inform the user about the following task] ******************************** 2026-04-07 
00:52:50.305877 | orchestrator | Tuesday 07 April 2026 00:51:40 +0000 (0:00:00.101) 0:00:00.101 ********* 2026-04-07 00:52:50.305885 | orchestrator | ok: [localhost] => { 2026-04-07 00:52:50.305895 | orchestrator |  "msg": "The task 'Check MariaDB service' fails if the MariaDB service has not yet been deployed. This is fine." 2026-04-07 00:52:50.305902 | orchestrator | } 2026-04-07 00:52:50.306110 | orchestrator | 2026-04-07 00:52:50.306119 | orchestrator | TASK [Check MariaDB service] *************************************************** 2026-04-07 00:52:50.306126 | orchestrator | Tuesday 07 April 2026 00:51:40 +0000 (0:00:00.044) 0:00:00.145 ********* 2026-04-07 00:52:50.306134 | orchestrator | fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.9:3306"} 2026-04-07 00:52:50.306144 | orchestrator | ...ignoring 2026-04-07 00:52:50.306150 | orchestrator | 2026-04-07 00:52:50.306156 | orchestrator | TASK [Set kolla_action_mariadb = upgrade if MariaDB is already running] ******** 2026-04-07 00:52:50.306161 | orchestrator | Tuesday 07 April 2026 00:51:43 +0000 (0:00:02.823) 0:00:02.968 ********* 2026-04-07 00:52:50.306165 | orchestrator | skipping: [localhost] 2026-04-07 00:52:50.306170 | orchestrator | 2026-04-07 00:52:50.306174 | orchestrator | TASK [Set kolla_action_mariadb = kolla_action_ng] ****************************** 2026-04-07 00:52:50.306179 | orchestrator | Tuesday 07 April 2026 00:51:43 +0000 (0:00:00.052) 0:00:03.021 ********* 2026-04-07 00:52:50.306183 | orchestrator | ok: [localhost] 2026-04-07 00:52:50.306188 | orchestrator | 2026-04-07 00:52:50.306192 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:52:50.306197 | orchestrator | 2026-04-07 00:52:50.306202 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 00:52:50.306206 | orchestrator | Tuesday 07 April 
2026 00:51:43 +0000 (0:00:00.164) 0:00:03.186 ********* 2026-04-07 00:52:50.306210 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:52:50.306215 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:52:50.306219 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:52:50.306224 | orchestrator | 2026-04-07 00:52:50.306229 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:52:50.306233 | orchestrator | Tuesday 07 April 2026 00:51:43 +0000 (0:00:00.256) 0:00:03.442 ********* 2026-04-07 00:52:50.306238 | orchestrator | ok: [testbed-node-0] => (item=enable_mariadb_True) 2026-04-07 00:52:50.306243 | orchestrator | ok: [testbed-node-1] => (item=enable_mariadb_True) 2026-04-07 00:52:50.306247 | orchestrator | ok: [testbed-node-2] => (item=enable_mariadb_True) 2026-04-07 00:52:50.306251 | orchestrator | 2026-04-07 00:52:50.306256 | orchestrator | PLAY [Apply role mariadb] ****************************************************** 2026-04-07 00:52:50.306260 | orchestrator | 2026-04-07 00:52:50.306265 | orchestrator | TASK [mariadb : Group MariaDB hosts based on shards] *************************** 2026-04-07 00:52:50.306269 | orchestrator | Tuesday 07 April 2026 00:51:43 +0000 (0:00:00.338) 0:00:03.781 ********* 2026-04-07 00:52:50.306273 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-04-07 00:52:50.306278 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1) 2026-04-07 00:52:50.306301 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2) 2026-04-07 00:52:50.306306 | orchestrator | 2026-04-07 00:52:50.306310 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-04-07 00:52:50.306315 | orchestrator | Tuesday 07 April 2026 00:51:44 +0000 (0:00:00.321) 0:00:04.103 ********* 2026-04-07 00:52:50.306319 | orchestrator | included: /ansible/roles/mariadb/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 
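The ignored "Check MariaDB service" failure above comes from an Ansible `wait_for`-style probe searching for the string `MariaDB` on `192.168.16.9:3306`. This works because a MySQL/MariaDB server speaks first: its initial handshake packet carries a NUL-terminated server version string right after the 4-byte packet header and the 1-byte protocol version. A sketch of that probe, assuming simplified handshake parsing (no packet-length validation):

```python
import socket

def version_from_handshake(data: bytes) -> bytes:
    """Extract the NUL-terminated server version from an initial handshake:
    4-byte packet header, 1-byte protocol version, then the version string."""
    if len(data) < 6:
        return b""
    end = data.find(b"\x00", 5)
    return data[5:end] if end != -1 else data[5:]

def looks_like_mariadb(host: str, port: int = 3306, timeout: float = 2.0) -> bool:
    """Mimic the wait_for check: connect and search the greeting for MariaDB."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            return b"MariaDB" in version_from_handshake(s.recv(128))
    except OSError:  # closed port / timeout, as when MariaDB is not yet deployed
        return False

# Synthetic handshake: header + protocol version 10 + version string.
packet = b"\x4a\x00\x00\x00" + b"\x0a" + b"10.11.16-MariaDB\x00" + b"\x00" * 32
```

A timeout here, as in the log, is the expected outcome on a fresh deployment, which is why the play prints the "This is fine." notice and falls through to `kolla_action_mariadb = kolla_action_ng` instead of `upgrade`.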
00:52:50.306325 | orchestrator | 2026-04-07 00:52:50.306329 | orchestrator | TASK [mariadb : Ensuring config directories exist] ***************************** 2026-04-07 00:52:50.306333 | orchestrator | Tuesday 07 April 2026 00:51:44 +0000 (0:00:00.584) 0:00:04.688 ********* 2026-04-07 00:52:50.306365 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check 
port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-07 00:52:50.306373 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 
backup', '']}}}}) 2026-04-07 00:52:50.306385 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-07 00:52:50.306393 | orchestrator | 2026-04-07 00:52:50.306399 | orchestrator | TASK [mariadb : Ensuring 
database backup config directory exists] ************** 2026-04-07 00:52:50.306404 | orchestrator | Tuesday 07 April 2026 00:51:47 +0000 (0:00:02.504) 0:00:07.192 ********* 2026-04-07 00:52:50.306409 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.306415 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.306420 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:52:50.306425 | orchestrator | 2026-04-07 00:52:50.306430 | orchestrator | TASK [mariadb : Copying over my.cnf for mariabackup] *************************** 2026-04-07 00:52:50.306435 | orchestrator | Tuesday 07 April 2026 00:51:47 +0000 (0:00:00.646) 0:00:07.838 ********* 2026-04-07 00:52:50.306441 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.306446 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.306451 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:52:50.306456 | orchestrator | 2026-04-07 00:52:50.306461 | orchestrator | TASK [mariadb : Copying over config.json files for services] ******************* 2026-04-07 00:52:50.306466 | orchestrator | Tuesday 07 April 2026 00:51:49 +0000 (0:00:01.245) 0:00:09.084 ********* 2026-04-07 00:52:50.306495 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': 
{'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-07 00:52:50.306514 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 
3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-07 00:52:50.306520 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 
192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-07 00:52:50.306529 | orchestrator | 2026-04-07 00:52:50.306534 | orchestrator | TASK [mariadb : Copying over config.json files for mariabackup] **************** 2026-04-07 00:52:50.306539 | orchestrator | Tuesday 07 April 2026 00:51:52 +0000 (0:00:03.631) 0:00:12.715 ********* 2026-04-07 00:52:50.306544 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.306550 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.306898 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:52:50.306911 | orchestrator | 2026-04-07 00:52:50.306918 | orchestrator | TASK [mariadb : Copying over galera.cnf] *************************************** 2026-04-07 00:52:50.306926 | orchestrator | Tuesday 07 April 2026 00:51:54 +0000 (0:00:01.332) 0:00:14.048 ********* 2026-04-07 00:52:50.306932 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:52:50.306939 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:52:50.306946 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:52:50.306953 | orchestrator | 2026-04-07 00:52:50.306961 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-04-07 00:52:50.306968 | orchestrator | 
Tuesday 07 April 2026 00:51:57 +0000 (0:00:03.300) 0:00:17.348 ********* 2026-04-07 00:52:50.306975 | orchestrator | included: /ansible/roles/mariadb/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:52:50.306982 | orchestrator | 2026-04-07 00:52:50.306997 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2026-04-07 00:52:50.307004 | orchestrator | Tuesday 07 April 2026 00:51:57 +0000 (0:00:00.449) 0:00:17.798 ********* 2026-04-07 00:52:50.307022 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 
3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307041 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307049 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout 
server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307056 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307074 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server 
testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307083 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307091 | orchestrator | 2026-04-07 00:52:50.307104 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS certificate] *** 2026-04-07 00:52:50.307111 | orchestrator | Tuesday 07 April 2026 00:52:00 +0000 (0:00:02.159) 0:00:19.958 ********* 2026-04-07 00:52:50.307118 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': 
{'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307126 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307144 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 
'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307152 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307223 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option 
clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307232 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307239 | orchestrator | 2026-04-07 00:52:50.307246 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2026-04-07 00:52:50.307253 | orchestrator | Tuesday 07 April 2026 00:52:02 +0000 (0:00:02.289) 0:00:22.247 ********* 2026-04-07 00:52:50.307270 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 
backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307278 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307296 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check 
port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307304 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307316 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 
'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307324 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307331 | orchestrator | 2026-04-07 00:52:50.307338 | orchestrator | TASK [service-check-containers : mariadb | Check containers] ******************* 2026-04-07 00:52:50.307349 | orchestrator | Tuesday 07 April 2026 00:52:04 +0000 (0:00:02.122) 0:00:24.370 ********* 2026-04-07 00:52:50.307362 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check 
port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-07 00:52:50.307374 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' 
server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-07 00:52:50.307389 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': 
False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-07 00:52:50.307401 | orchestrator | 2026-04-07 00:52:50.307408 | orchestrator | TASK [service-check-containers : mariadb | Notify handlers to restart containers] *** 2026-04-07 00:52:50.307415 | orchestrator | Tuesday 07 April 2026 00:52:07 +0000 (0:00:02.522) 0:00:26.892 ********* 2026-04-07 00:52:50.307422 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:52:50.307430 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:52:50.307437 | orchestrator | } 2026-04-07 00:52:50.307444 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 00:52:50.307451 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:52:50.307458 | orchestrator | } 2026-04-07 00:52:50.307465 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:52:50.307472 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:52:50.307480 | orchestrator | } 2026-04-07 00:52:50.307487 | orchestrator | 2026-04-07 00:52:50.307494 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:52:50.307501 | orchestrator | Tuesday 07 April 2026 00:52:07 +0000 (0:00:00.276) 0:00:27.169 ********* 2026-04-07 00:52:50.307512 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307525 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307538 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307546 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307557 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.307591 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307599 | orchestrator | 2026-04-07 00:52:50.307606 | orchestrator | TASK [mariadb : Checking for mariadb cluster] ********************************** 2026-04-07 00:52:50.307613 | orchestrator | Tuesday 07 April 2026 00:52:09 +0000 (0:00:01.942) 0:00:29.112 ********* 2026-04-07 00:52:50.307624 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307631 | orchestrator | skipping: [testbed-node-1] 2026-04-07 
00:52:50.307638 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307645 | orchestrator | 2026-04-07 00:52:50.307652 | orchestrator | TASK [mariadb : Cleaning up temp file on localhost] **************************** 2026-04-07 00:52:50.307658 | orchestrator | Tuesday 07 April 2026 00:52:09 +0000 (0:00:00.391) 0:00:29.503 ********* 2026-04-07 00:52:50.307665 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307672 | orchestrator | 2026-04-07 00:52:50.307683 | orchestrator | TASK [mariadb : Stop MariaDB containers] *************************************** 2026-04-07 00:52:50.307690 | orchestrator | Tuesday 07 April 2026 00:52:09 +0000 (0:00:00.096) 0:00:29.600 ********* 2026-04-07 00:52:50.307696 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307703 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307710 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307717 | orchestrator | 2026-04-07 00:52:50.307724 | orchestrator | TASK [mariadb : Run MariaDB wsrep recovery] ************************************ 2026-04-07 00:52:50.307731 | orchestrator | Tuesday 07 April 2026 00:52:10 +0000 (0:00:00.273) 0:00:29.873 ********* 2026-04-07 00:52:50.307738 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307745 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307752 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307758 | orchestrator | 2026-04-07 00:52:50.307765 | orchestrator | TASK [mariadb : Copying MariaDB log file to /tmp] ****************************** 2026-04-07 00:52:50.307772 | orchestrator | Tuesday 07 April 2026 00:52:10 +0000 (0:00:00.299) 0:00:30.172 ********* 2026-04-07 00:52:50.307778 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307785 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307792 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307799 | orchestrator | 2026-04-07 00:52:50.307806 | orchestrator 
| TASK [mariadb : Get MariaDB wsrep recovery seqno] ****************************** 2026-04-07 00:52:50.307813 | orchestrator | Tuesday 07 April 2026 00:52:10 +0000 (0:00:00.268) 0:00:30.440 ********* 2026-04-07 00:52:50.307820 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307826 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307833 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307840 | orchestrator | 2026-04-07 00:52:50.307847 | orchestrator | TASK [mariadb : Removing MariaDB log file from /tmp] *************************** 2026-04-07 00:52:50.307854 | orchestrator | Tuesday 07 April 2026 00:52:10 +0000 (0:00:00.376) 0:00:30.817 ********* 2026-04-07 00:52:50.307861 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307868 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307875 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307882 | orchestrator | 2026-04-07 00:52:50.307889 | orchestrator | TASK [mariadb : Registering MariaDB seqno variable] **************************** 2026-04-07 00:52:50.307896 | orchestrator | Tuesday 07 April 2026 00:52:11 +0000 (0:00:00.259) 0:00:31.076 ********* 2026-04-07 00:52:50.307903 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307910 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.307917 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.307924 | orchestrator | 2026-04-07 00:52:50.307931 | orchestrator | TASK [mariadb : Comparing seqno value on all mariadb hosts] ******************** 2026-04-07 00:52:50.307938 | orchestrator | Tuesday 07 April 2026 00:52:11 +0000 (0:00:00.263) 0:00:31.340 ********* 2026-04-07 00:52:50.307945 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-07 00:52:50.307952 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-07 00:52:50.307964 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-07 
00:52:50.307971 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.307978 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)  2026-04-07 00:52:50.307985 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)  2026-04-07 00:52:50.307993 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)  2026-04-07 00:52:50.308000 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308007 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)  2026-04-07 00:52:50.308014 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)  2026-04-07 00:52:50.308021 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)  2026-04-07 00:52:50.308028 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308035 | orchestrator | 2026-04-07 00:52:50.308042 | orchestrator | TASK [mariadb : Writing hostname of host with the largest seqno to temp file] *** 2026-04-07 00:52:50.308050 | orchestrator | Tuesday 07 April 2026 00:52:11 +0000 (0:00:00.284) 0:00:31.625 ********* 2026-04-07 00:52:50.308056 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308063 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308145 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308154 | orchestrator | 2026-04-07 00:52:50.308161 | orchestrator | TASK [mariadb : Registering mariadb_recover_inventory_name from temp file] ***** 2026-04-07 00:52:50.308168 | orchestrator | Tuesday 07 April 2026 00:52:12 +0000 (0:00:00.384) 0:00:32.010 ********* 2026-04-07 00:52:50.308176 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308183 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308190 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308196 | orchestrator | 2026-04-07 00:52:50.308209 | orchestrator | TASK [mariadb : Store bootstrap and master hostnames into facts] *************** 2026-04-07 00:52:50.308216 | orchestrator | 
Tuesday 07 April 2026 00:52:12 +0000 (0:00:00.258) 0:00:32.268 ********* 2026-04-07 00:52:50.308222 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308229 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308236 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308243 | orchestrator | 2026-04-07 00:52:50.308250 | orchestrator | TASK [mariadb : Set grastate.dat file from MariaDB container in bootstrap host] *** 2026-04-07 00:52:50.308256 | orchestrator | Tuesday 07 April 2026 00:52:12 +0000 (0:00:00.263) 0:00:32.532 ********* 2026-04-07 00:52:50.308264 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308271 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308278 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308285 | orchestrator | 2026-04-07 00:52:50.308291 | orchestrator | TASK [mariadb : Starting first MariaDB container] ****************************** 2026-04-07 00:52:50.308298 | orchestrator | Tuesday 07 April 2026 00:52:12 +0000 (0:00:00.294) 0:00:32.826 ********* 2026-04-07 00:52:50.308305 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308312 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308319 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308326 | orchestrator | 2026-04-07 00:52:50.308332 | orchestrator | TASK [mariadb : Wait for first MariaDB container] ****************************** 2026-04-07 00:52:50.308339 | orchestrator | Tuesday 07 April 2026 00:52:13 +0000 (0:00:00.470) 0:00:33.297 ********* 2026-04-07 00:52:50.308345 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308353 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308363 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308370 | orchestrator | 2026-04-07 00:52:50.308377 | orchestrator | TASK [mariadb : Set first MariaDB container as primary] ************************ 2026-04-07 00:52:50.308384 | orchestrator | 
Tuesday 07 April 2026 00:52:13 +0000 (0:00:00.297) 0:00:33.594 ********* 2026-04-07 00:52:50.308391 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308397 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308404 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308411 | orchestrator | 2026-04-07 00:52:50.308417 | orchestrator | TASK [mariadb : Wait for MariaDB to become operational] ************************ 2026-04-07 00:52:50.308430 | orchestrator | Tuesday 07 April 2026 00:52:14 +0000 (0:00:00.303) 0:00:33.897 ********* 2026-04-07 00:52:50.308437 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308444 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308451 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308458 | orchestrator | 2026-04-07 00:52:50.308465 | orchestrator | TASK [mariadb : Restart slave MariaDB container(s)] **************************** 2026-04-07 00:52:50.308471 | orchestrator | Tuesday 07 April 2026 00:52:14 +0000 (0:00:00.290) 0:00:34.188 ********* 2026-04-07 00:52:50.308479 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option 
clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.308487 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308504 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': 
['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.308515 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308523 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 
'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.308530 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308537 | orchestrator | 2026-04-07 00:52:50.308544 | orchestrator | TASK [mariadb : Wait for slave MariaDB] **************************************** 2026-04-07 00:52:50.308551 | orchestrator | Tuesday 07 April 2026 00:52:16 +0000 (0:00:02.029) 0:00:36.218 ********* 2026-04-07 00:52:50.308558 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308565 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308588 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308596 | orchestrator | 2026-04-07 00:52:50.308603 | orchestrator | TASK [mariadb : Restart master MariaDB container(s)] *************************** 2026-04-07 00:52:50.308609 | orchestrator | Tuesday 07 April 2026 00:52:16 +0000 (0:00:00.496) 0:00:36.715 ********* 2026-04-07 00:52:50.308627 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': 
['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.308644 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308651 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.308659 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308673 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-07 00:52:50.308686 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308693 | orchestrator | 2026-04-07 00:52:50.308700 | orchestrator | TASK [mariadb : Wait for master mariadb] *************************************** 2026-04-07 00:52:50.308707 | orchestrator | Tuesday 07 April 2026 00:52:18 +0000 (0:00:02.024) 0:00:38.739 ********* 2026-04-07 00:52:50.308713 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308720 | orchestrator | skipping: 
[testbed-node-1] 2026-04-07 00:52:50.308727 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308734 | orchestrator | 2026-04-07 00:52:50.308741 | orchestrator | TASK [service-check : mariadb | Get container facts] *************************** 2026-04-07 00:52:50.308748 | orchestrator | Tuesday 07 April 2026 00:52:19 +0000 (0:00:00.383) 0:00:39.123 ********* 2026-04-07 00:52:50.308755 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308762 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308769 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308775 | orchestrator | 2026-04-07 00:52:50.308782 | orchestrator | TASK [service-check : mariadb | Fail if containers are missing or not running] *** 2026-04-07 00:52:50.308789 | orchestrator | Tuesday 07 April 2026 00:52:19 +0000 (0:00:00.308) 0:00:39.431 ********* 2026-04-07 00:52:50.308796 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308803 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308809 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308816 | orchestrator | 2026-04-07 00:52:50.308823 | orchestrator | TASK [service-check : mariadb | Fail if containers are unhealthy] ************** 2026-04-07 00:52:50.308830 | orchestrator | Tuesday 07 April 2026 00:52:20 +0000 (0:00:00.502) 0:00:39.934 ********* 2026-04-07 00:52:50.308837 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308844 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.308851 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308858 | orchestrator | 2026-04-07 00:52:50.308864 | orchestrator | TASK [mariadb : Wait for MariaDB service to be ready through VIP] ************** 2026-04-07 00:52:50.308871 | orchestrator | Tuesday 07 April 2026 00:52:20 +0000 (0:00:00.519) 0:00:40.453 ********* 2026-04-07 00:52:50.308878 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.308885 | orchestrator | skipping: 
[testbed-node-1] 2026-04-07 00:52:50.308891 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.308898 | orchestrator | 2026-04-07 00:52:50.308905 | orchestrator | TASK [mariadb : Create MariaDB volume] ***************************************** 2026-04-07 00:52:50.308912 | orchestrator | Tuesday 07 April 2026 00:52:20 +0000 (0:00:00.326) 0:00:40.780 ********* 2026-04-07 00:52:50.308918 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:52:50.308925 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:52:50.308932 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:52:50.308939 | orchestrator | 2026-04-07 00:52:50.308945 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB volume availability] ************* 2026-04-07 00:52:50.308953 | orchestrator | Tuesday 07 April 2026 00:52:21 +0000 (0:00:01.032) 0:00:41.812 ********* 2026-04-07 00:52:50.308960 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:52:50.308967 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:52:50.308974 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:52:50.308981 | orchestrator | 2026-04-07 00:52:50.308988 | orchestrator | TASK [mariadb : Establish whether the cluster has already existed] ************* 2026-04-07 00:52:50.308995 | orchestrator | Tuesday 07 April 2026 00:52:22 +0000 (0:00:00.307) 0:00:42.120 ********* 2026-04-07 00:52:50.309002 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:52:50.309014 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:52:50.309021 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:52:50.309028 | orchestrator | 2026-04-07 00:52:50.309035 | orchestrator | TASK [mariadb : Check MariaDB service port liveness] *************************** 2026-04-07 00:52:50.309041 | orchestrator | Tuesday 07 April 2026 00:52:22 +0000 (0:00:00.322) 0:00:42.442 ********* 2026-04-07 00:52:50.309049 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.10:3306"} 2026-04-07 00:52:50.309057 | orchestrator | ...ignoring 2026-04-07 00:52:50.309064 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.11:3306"} 2026-04-07 00:52:50.309071 | orchestrator | ...ignoring 2026-04-07 00:52:50.309079 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.12:3306"} 2026-04-07 00:52:50.309090 | orchestrator | ...ignoring 2026-04-07 00:52:50.309097 | orchestrator | 2026-04-07 00:52:50.309103 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service port liveness] *********** 2026-04-07 00:52:50.309110 | orchestrator | Tuesday 07 April 2026 00:52:33 +0000 (0:00:10.844) 0:00:53.286 ********* 2026-04-07 00:52:50.309117 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:52:50.309123 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:52:50.309130 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:52:50.309137 | orchestrator | 2026-04-07 00:52:50.309144 | orchestrator | TASK [mariadb : Fail on existing but stopped cluster] ************************** 2026-04-07 00:52:50.309150 | orchestrator | Tuesday 07 April 2026 00:52:33 +0000 (0:00:00.397) 0:00:53.684 ********* 2026-04-07 00:52:50.309157 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.309163 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.309169 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.309175 | orchestrator | 2026-04-07 00:52:50.309182 | orchestrator | TASK [mariadb : Check MariaDB service WSREP sync status] *********************** 2026-04-07 00:52:50.309188 | orchestrator | Tuesday 07 April 2026 00:52:34 +0000 (0:00:00.260) 0:00:53.945 ********* 2026-04-07 00:52:50.309195 | orchestrator | skipping: 
[testbed-node-0] 2026-04-07 00:52:50.309202 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.309208 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.309215 | orchestrator | 2026-04-07 00:52:50.309221 | orchestrator | TASK [mariadb : Extract MariaDB service WSREP sync status] ********************* 2026-04-07 00:52:50.309228 | orchestrator | Tuesday 07 April 2026 00:52:34 +0000 (0:00:00.287) 0:00:54.232 ********* 2026-04-07 00:52:50.309235 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.309247 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.309255 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.309261 | orchestrator | 2026-04-07 00:52:50.309268 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service WSREP sync status] ******* 2026-04-07 00:52:50.309275 | orchestrator | Tuesday 07 April 2026 00:52:34 +0000 (0:00:00.278) 0:00:54.510 ********* 2026-04-07 00:52:50.309282 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:52:50.309289 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:52:50.309295 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:52:50.309302 | orchestrator | 2026-04-07 00:52:50.309309 | orchestrator | TASK [mariadb : Fail when MariaDB services are not synced across the whole cluster] *** 2026-04-07 00:52:50.309316 | orchestrator | Tuesday 07 April 2026 00:52:34 +0000 (0:00:00.257) 0:00:54.767 ********* 2026-04-07 00:52:50.309322 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:52:50.309329 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.309337 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.309344 | orchestrator | 2026-04-07 00:52:50.309351 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-04-07 00:52:50.309358 | orchestrator | Tuesday 07 April 2026 00:52:35 +0000 (0:00:00.402) 0:00:55.170 ********* 2026-04-07 00:52:50.309365 | orchestrator | skipping: 
[testbed-node-1] 2026-04-07 00:52:50.309372 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.309384 | orchestrator | included: /ansible/roles/mariadb/tasks/bootstrap_cluster.yml for testbed-node-0 2026-04-07 00:52:50.309391 | orchestrator | 2026-04-07 00:52:50.309398 | orchestrator | TASK [mariadb : Running MariaDB bootstrap container] *************************** 2026-04-07 00:52:50.309405 | orchestrator | Tuesday 07 April 2026 00:52:35 +0000 (0:00:00.322) 0:00:55.493 ********* 2026-04-07 00:52:50.309417 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=10.11.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmariadb-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_ylfxxkmp/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_ylfxxkmp/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_ylfxxkmp/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", 
line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=10.11.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmariadb-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:52:50.309426 | orchestrator | 2026-04-07 00:52:50.309433 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-04-07 00:52:50.309440 | orchestrator | Tuesday 07 April 2026 00:52:39 +0000 (0:00:03.723) 0:00:59.216 ********* 2026-04-07 00:52:50.309447 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.309454 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.309461 | orchestrator | 2026-04-07 00:52:50.309468 | orchestrator | RUNNING HANDLER [mariadb : Restart MariaDB on existing cluster members] ******** 2026-04-07 00:52:50.309474 | orchestrator | Tuesday 07 April 2026 00:52:39 +0000 (0:00:00.500) 0:00:59.716 ********* 2026-04-07 00:52:50.309481 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:52:50.309488 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:52:50.309494 | orchestrator | 2026-04-07 00:52:50.309501 | orchestrator | RUNNING HANDLER [mariadb : Start MariaDB on new nodes] ************************* 2026-04-07 00:52:50.309509 | orchestrator | Tuesday 07 April 2026 00:52:40 +0000 (0:00:00.192) 0:00:59.908 ********* 2026-04-07 00:52:50.309515 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:52:50.309522 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:52:50.309533 | orchestrator | [WARNING]: Could 
not match supplied host pattern, ignoring: mariadb_restart 2026-04-07 00:52:50.309540 | orchestrator | 2026-04-07 00:52:50.309546 | orchestrator | PLAY [Restart mariadb services] ************************************************ 2026-04-07 00:52:50.309558 | orchestrator | skipping: no hosts matched 2026-04-07 00:52:50.309565 | orchestrator | 2026-04-07 00:52:50.309597 | orchestrator | PLAY [Start mariadb services] ************************************************** 2026-04-07 00:52:50.309604 | orchestrator | 2026-04-07 00:52:50.309611 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2026-04-07 00:52:50.309618 | orchestrator | Tuesday 07 April 2026 00:52:40 +0000 (0:00:00.215) 0:01:00.123 ********* 2026-04-07 00:52:50.309625 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=10.11.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmariadb-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_br13fneg/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_br13fneg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_br13fneg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_br13fneg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=10.11.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fmariadb-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 00:52:50.309633 | orchestrator | 2026-04-07 00:52:50.309640 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:52:50.309651 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1  2026-04-07 00:52:50.309658 | orchestrator | testbed-node-0 : ok=20  changed=9  unreachable=0 failed=1  skipped=33  rescued=0 ignored=1  2026-04-07 00:52:50.309666 | orchestrator | testbed-node-1 : ok=16  changed=7  unreachable=0 failed=1  skipped=38  rescued=0 ignored=1  2026-04-07 00:52:50.309672 | orchestrator | testbed-node-2 : ok=16  changed=7  unreachable=0 failed=0 skipped=38  rescued=0 ignored=1  2026-04-07 00:52:50.309681 | orchestrator | 2026-04-07 00:52:50.309688 | orchestrator | 2026-04-07 00:52:50.309694 | orchestrator | TASKS 
RECAP ******************************************************************** 2026-04-07 00:52:50.309701 | orchestrator | Tuesday 07 April 2026 00:52:48 +0000 (0:00:08.037) 0:01:08.161 ********* 2026-04-07 00:52:50.309713 | orchestrator | =============================================================================== 2026-04-07 00:52:50.309720 | orchestrator | mariadb : Check MariaDB service port liveness -------------------------- 10.84s 2026-04-07 00:52:50.309727 | orchestrator | mariadb : Restart MariaDB container ------------------------------------- 8.04s 2026-04-07 00:52:50.309739 | orchestrator | mariadb : Running MariaDB bootstrap container --------------------------- 3.72s 2026-04-07 00:52:50.309746 | orchestrator | mariadb : Copying over config.json files for services ------------------- 3.63s 2026-04-07 00:52:50.309753 | orchestrator | mariadb : Copying over galera.cnf --------------------------------------- 3.30s 2026-04-07 00:52:50.309760 | orchestrator | Check MariaDB service --------------------------------------------------- 2.82s 2026-04-07 00:52:50.309767 | orchestrator | service-check-containers : mariadb | Check containers ------------------- 2.52s 2026-04-07 00:52:50.309773 | orchestrator | mariadb : Ensuring config directories exist ----------------------------- 2.50s 2026-04-07 00:52:50.309780 | orchestrator | service-cert-copy : mariadb | Copying over backend internal TLS certificate --- 2.29s 2026-04-07 00:52:50.309787 | orchestrator | service-cert-copy : mariadb | Copying over extra CA certificates -------- 2.16s 2026-04-07 00:52:50.309794 | orchestrator | service-cert-copy : mariadb | Copying over backend internal TLS key ----- 2.12s 2026-04-07 00:52:50.309801 | orchestrator | mariadb : Restart slave MariaDB container(s) ---------------------------- 2.03s 2026-04-07 00:52:50.309808 | orchestrator | mariadb : Restart master MariaDB container(s) --------------------------- 2.02s 2026-04-07 00:52:50.309814 | orchestrator | 
service-check-containers : Include tasks -------------------------------- 1.94s 2026-04-07 00:52:50.309821 | orchestrator | mariadb : Copying over config.json files for mariabackup ---------------- 1.33s 2026-04-07 00:52:50.309827 | orchestrator | mariadb : Copying over my.cnf for mariabackup --------------------------- 1.25s 2026-04-07 00:52:50.309834 | orchestrator | mariadb : Create MariaDB volume ----------------------------------------- 1.03s 2026-04-07 00:52:50.309841 | orchestrator | mariadb : Ensuring database backup config directory exists -------------- 0.65s 2026-04-07 00:52:50.309848 | orchestrator | mariadb : include_tasks ------------------------------------------------- 0.58s 2026-04-07 00:52:50.309855 | orchestrator | service-check : mariadb | Fail if containers are unhealthy -------------- 0.52s 2026-04-07 00:52:50.309861 | orchestrator | 2026-04-07 00:52:50 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:52:50.309869 | orchestrator | 2026-04-07 00:52:50 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:52:50.309876 | orchestrator | 2026-04-07 00:52:50 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:50.309883 | orchestrator | 2026-04-07 00:52:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:53.335416 | orchestrator | 2026-04-07 00:52:53 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:52:53.335720 | orchestrator | 2026-04-07 00:52:53 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:52:53.336284 | orchestrator | 2026-04-07 00:52:53 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:53.336315 | orchestrator | 2026-04-07 00:52:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:56.367749 | orchestrator | 2026-04-07 00:52:56 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 
2026-04-07 00:52:56.368554 | orchestrator | 2026-04-07 00:52:56 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:52:56.369135 | orchestrator | 2026-04-07 00:52:56 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:56.369178 | orchestrator | 2026-04-07 00:52:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:52:59.408348 | orchestrator | 2026-04-07 00:52:59 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:52:59.409493 | orchestrator | 2026-04-07 00:52:59 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:52:59.411379 | orchestrator | 2026-04-07 00:52:59 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:52:59.411430 | orchestrator | 2026-04-07 00:52:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:02.442537 | orchestrator | 2026-04-07 00:53:02 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:02.442590 | orchestrator | 2026-04-07 00:53:02 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:53:02.443992 | orchestrator | 2026-04-07 00:53:02 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:02.444043 | orchestrator | 2026-04-07 00:53:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:05.472440 | orchestrator | 2026-04-07 00:53:05 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:05.475733 | orchestrator | 2026-04-07 00:53:05 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:53:05.476989 | orchestrator | 2026-04-07 00:53:05 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:05.477052 | orchestrator | 2026-04-07 00:53:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:08.506891 | orchestrator | 2026-04-07 
00:53:08 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:08.508542 | orchestrator | 2026-04-07 00:53:08 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:53:08.509293 | orchestrator | 2026-04-07 00:53:08 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:08.509371 | orchestrator | 2026-04-07 00:53:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:11.544378 | orchestrator | 2026-04-07 00:53:11 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:11.544704 | orchestrator | 2026-04-07 00:53:11 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:53:11.545496 | orchestrator | 2026-04-07 00:53:11 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:11.545526 | orchestrator | 2026-04-07 00:53:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:14.580160 | orchestrator | 2026-04-07 00:53:14 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:14.580304 | orchestrator | 2026-04-07 00:53:14 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:53:14.580988 | orchestrator | 2026-04-07 00:53:14 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:14.581029 | orchestrator | 2026-04-07 00:53:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:17.610298 | orchestrator | 2026-04-07 00:53:17 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:17.610360 | orchestrator | 2026-04-07 00:53:17 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state STARTED 2026-04-07 00:53:17.611252 | orchestrator | 2026-04-07 00:53:17 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:17.611281 | orchestrator | 2026-04-07 00:53:17 | INFO  | Wait 1 
second(s) until the next check 2026-04-07 00:53:20.644257 | orchestrator | 2026-04-07 00:53:20 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:20.647389 | orchestrator | 2026-04-07 00:53:20 | INFO  | Task 8169e4df-b468-4230-9d0f-f55a0340848d is in state SUCCESS 2026-04-07 00:53:20.648488 | orchestrator | 2026-04-07 00:53:20.648553 | orchestrator | 2026-04-07 00:53:20.648563 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:53:20.648571 | orchestrator | 2026-04-07 00:53:20.648578 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 00:53:20.648585 | orchestrator | Tuesday 07 April 2026 00:52:50 +0000 (0:00:00.224) 0:00:00.224 ********* 2026-04-07 00:53:20.648592 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.648601 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.648608 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.648616 | orchestrator | 2026-04-07 00:53:20.648623 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:53:20.648630 | orchestrator | Tuesday 07 April 2026 00:52:51 +0000 (0:00:00.194) 0:00:00.419 ********* 2026-04-07 00:53:20.648636 | orchestrator | ok: [testbed-node-0] => (item=enable_horizon_True) 2026-04-07 00:53:20.648668 | orchestrator | ok: [testbed-node-1] => (item=enable_horizon_True) 2026-04-07 00:53:20.648676 | orchestrator | ok: [testbed-node-2] => (item=enable_horizon_True) 2026-04-07 00:53:20.648683 | orchestrator | 2026-04-07 00:53:20.648689 | orchestrator | PLAY [Apply role horizon] ****************************************************** 2026-04-07 00:53:20.648696 | orchestrator | 2026-04-07 00:53:20.648703 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-04-07 00:53:20.648709 | orchestrator | Tuesday 07 April 2026 00:52:51 +0000 
(0:00:00.252) 0:00:00.671 ********* 2026-04-07 00:53:20.648726 | orchestrator | included: /ansible/roles/horizon/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:53:20.648734 | orchestrator | 2026-04-07 00:53:20.648740 | orchestrator | TASK [horizon : Ensuring config directories exist] ***************************** 2026-04-07 00:53:20.648747 | orchestrator | Tuesday 07 April 2026 00:52:51 +0000 (0:00:00.533) 0:00:01.204 ********* 2026-04-07 00:53:20.648757 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.648794 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 
'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.648804 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.648816 | orchestrator | 2026-04-07 00:53:20.649040 | orchestrator | TASK [horizon : Set empty custom policy] *************************************** 2026-04-07 00:53:20.649048 | orchestrator | Tuesday 07 April 2026 00:52:53 +0000 (0:00:01.591) 0:00:02.796 ********* 2026-04-07 00:53:20.649055 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.649061 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.649068 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.649074 | orchestrator | 2026-04-07 00:53:20.649086 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-04-07 00:53:20.649093 | orchestrator | Tuesday 07 April 
2026 00:52:53 +0000 (0:00:00.225) 0:00:03.022 ********* 2026-04-07 00:53:20.649100 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'cloudkitty', 'enabled': False})  2026-04-07 00:53:20.649107 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'heat', 'enabled': 'no'})  2026-04-07 00:53:20.649113 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'ironic', 'enabled': False})  2026-04-07 00:53:20.649120 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'masakari', 'enabled': False})  2026-04-07 00:53:20.649126 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'mistral', 'enabled': False})  2026-04-07 00:53:20.649131 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'tacker', 'enabled': False})  2026-04-07 00:53:20.649137 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'trove', 'enabled': False})  2026-04-07 00:53:20.649143 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'watcher', 'enabled': False})  2026-04-07 00:53:20.649149 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'cloudkitty', 'enabled': False})  2026-04-07 00:53:20.649160 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'heat', 'enabled': 'no'})  2026-04-07 00:53:20.649167 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'ironic', 'enabled': False})  2026-04-07 00:53:20.649173 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'masakari', 'enabled': False})  2026-04-07 00:53:20.649180 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'mistral', 'enabled': False})  2026-04-07 00:53:20.649186 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'tacker', 'enabled': False})  2026-04-07 00:53:20.649193 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'trove', 'enabled': False})  2026-04-07 00:53:20.649199 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'watcher', 'enabled': False})  2026-04-07 00:53:20.649206 | 
orchestrator | skipping: [testbed-node-2] => (item={'name': 'cloudkitty', 'enabled': False})  2026-04-07 00:53:20.649212 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'heat', 'enabled': 'no'})  2026-04-07 00:53:20.649218 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'ironic', 'enabled': False})  2026-04-07 00:53:20.649224 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'masakari', 'enabled': False})  2026-04-07 00:53:20.649231 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'mistral', 'enabled': False})  2026-04-07 00:53:20.649237 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'tacker', 'enabled': False})  2026-04-07 00:53:20.649244 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'trove', 'enabled': False})  2026-04-07 00:53:20.649251 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'watcher', 'enabled': False})  2026-04-07 00:53:20.649266 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'ceilometer', 'enabled': 'yes'}) 2026-04-07 00:53:20.649274 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'cinder', 'enabled': 'yes'}) 2026-04-07 00:53:20.649282 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'designate', 'enabled': True}) 2026-04-07 00:53:20.649289 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'glance', 'enabled': True}) 2026-04-07 00:53:20.649295 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'keystone', 'enabled': True}) 2026-04-07 00:53:20.649302 | orchestrator | included: 
/ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'magnum', 'enabled': True}) 2026-04-07 00:53:20.649309 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'manila', 'enabled': True}) 2026-04-07 00:53:20.649316 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'neutron', 'enabled': True}) 2026-04-07 00:53:20.649323 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'nova', 'enabled': True}) 2026-04-07 00:53:20.649331 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'octavia', 'enabled': True}) 2026-04-07 00:53:20.649337 | orchestrator | 2026-04-07 00:53:20.649344 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.649351 | orchestrator | Tuesday 07 April 2026 00:52:54 +0000 (0:00:00.645) 0:00:03.668 ********* 2026-04-07 00:53:20.649357 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.649363 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.649374 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.649381 | orchestrator | 2026-04-07 00:53:20.649387 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.649394 | orchestrator | Tuesday 07 April 2026 00:52:54 +0000 (0:00:00.411) 0:00:04.079 ********* 2026-04-07 00:53:20.649401 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649408 | orchestrator | 2026-04-07 00:53:20.649415 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.649421 | orchestrator | Tuesday 07 April 2026 
00:52:54 +0000 (0:00:00.107) 0:00:04.187 ********* 2026-04-07 00:53:20.649428 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649435 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.649442 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.649448 | orchestrator | 2026-04-07 00:53:20.649455 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.649462 | orchestrator | Tuesday 07 April 2026 00:52:55 +0000 (0:00:00.237) 0:00:04.424 ********* 2026-04-07 00:53:20.649469 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.649476 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.649482 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.649488 | orchestrator | 2026-04-07 00:53:20.649494 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.649500 | orchestrator | Tuesday 07 April 2026 00:52:55 +0000 (0:00:00.317) 0:00:04.742 ********* 2026-04-07 00:53:20.649509 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649516 | orchestrator | 2026-04-07 00:53:20.649522 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.649535 | orchestrator | Tuesday 07 April 2026 00:52:55 +0000 (0:00:00.124) 0:00:04.867 ********* 2026-04-07 00:53:20.649549 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649556 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.649562 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.649568 | orchestrator | 2026-04-07 00:53:20.649575 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.649582 | orchestrator | Tuesday 07 April 2026 00:52:55 +0000 (0:00:00.353) 0:00:05.220 ********* 2026-04-07 00:53:20.649588 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.649595 | orchestrator | ok: 
[testbed-node-1] 2026-04-07 00:53:20.649602 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.649609 | orchestrator | 2026-04-07 00:53:20.649616 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.649623 | orchestrator | Tuesday 07 April 2026 00:52:56 +0000 (0:00:00.256) 0:00:05.477 ********* 2026-04-07 00:53:20.649630 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649636 | orchestrator | 2026-04-07 00:53:20.649681 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.649690 | orchestrator | Tuesday 07 April 2026 00:52:56 +0000 (0:00:00.092) 0:00:05.569 ********* 2026-04-07 00:53:20.649697 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649704 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.649711 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.649717 | orchestrator | 2026-04-07 00:53:20.649725 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.649732 | orchestrator | Tuesday 07 April 2026 00:52:56 +0000 (0:00:00.233) 0:00:05.803 ********* 2026-04-07 00:53:20.649739 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.649745 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.649752 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.649759 | orchestrator | 2026-04-07 00:53:20.649765 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.649772 | orchestrator | Tuesday 07 April 2026 00:52:56 +0000 (0:00:00.413) 0:00:06.216 ********* 2026-04-07 00:53:20.649779 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649786 | orchestrator | 2026-04-07 00:53:20.649792 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.649799 | orchestrator | Tuesday 07 
April 2026 00:52:57 +0000 (0:00:00.107) 0:00:06.324 ********* 2026-04-07 00:53:20.649806 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649813 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.649820 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.649827 | orchestrator | 2026-04-07 00:53:20.649834 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.649840 | orchestrator | Tuesday 07 April 2026 00:52:57 +0000 (0:00:00.477) 0:00:06.802 ********* 2026-04-07 00:53:20.649847 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.649854 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.649861 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.649868 | orchestrator | 2026-04-07 00:53:20.649874 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.649880 | orchestrator | Tuesday 07 April 2026 00:52:57 +0000 (0:00:00.293) 0:00:07.095 ********* 2026-04-07 00:53:20.649887 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649893 | orchestrator | 2026-04-07 00:53:20.649899 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.649905 | orchestrator | Tuesday 07 April 2026 00:52:57 +0000 (0:00:00.129) 0:00:07.224 ********* 2026-04-07 00:53:20.649912 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649918 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.649924 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.649931 | orchestrator | 2026-04-07 00:53:20.649937 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.649943 | orchestrator | Tuesday 07 April 2026 00:52:58 +0000 (0:00:00.273) 0:00:07.498 ********* 2026-04-07 00:53:20.649955 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.649962 | 
orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.649968 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.649974 | orchestrator | 2026-04-07 00:53:20.649981 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.649987 | orchestrator | Tuesday 07 April 2026 00:52:58 +0000 (0:00:00.284) 0:00:07.782 ********* 2026-04-07 00:53:20.649993 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.649999 | orchestrator | 2026-04-07 00:53:20.650006 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.650050 | orchestrator | Tuesday 07 April 2026 00:52:58 +0000 (0:00:00.102) 0:00:07.885 ********* 2026-04-07 00:53:20.650064 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650072 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.650079 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.650086 | orchestrator | 2026-04-07 00:53:20.650093 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.650100 | orchestrator | Tuesday 07 April 2026 00:52:59 +0000 (0:00:00.440) 0:00:08.326 ********* 2026-04-07 00:53:20.650107 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.650114 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.650121 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.650128 | orchestrator | 2026-04-07 00:53:20.650135 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.650142 | orchestrator | Tuesday 07 April 2026 00:52:59 +0000 (0:00:00.290) 0:00:08.616 ********* 2026-04-07 00:53:20.650149 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650156 | orchestrator | 2026-04-07 00:53:20.650163 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.650170 | 
orchestrator | Tuesday 07 April 2026 00:52:59 +0000 (0:00:00.120) 0:00:08.737 ********* 2026-04-07 00:53:20.650177 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650184 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.650191 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.650198 | orchestrator | 2026-04-07 00:53:20.650205 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.650216 | orchestrator | Tuesday 07 April 2026 00:52:59 +0000 (0:00:00.271) 0:00:09.009 ********* 2026-04-07 00:53:20.650223 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.650230 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.650237 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.650244 | orchestrator | 2026-04-07 00:53:20.650251 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.650258 | orchestrator | Tuesday 07 April 2026 00:53:00 +0000 (0:00:00.307) 0:00:09.316 ********* 2026-04-07 00:53:20.650265 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650272 | orchestrator | 2026-04-07 00:53:20.650279 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.650285 | orchestrator | Tuesday 07 April 2026 00:53:00 +0000 (0:00:00.331) 0:00:09.648 ********* 2026-04-07 00:53:20.650292 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650299 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.650306 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.650313 | orchestrator | 2026-04-07 00:53:20.650320 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.650327 | orchestrator | Tuesday 07 April 2026 00:53:00 +0000 (0:00:00.339) 0:00:09.988 ********* 2026-04-07 00:53:20.650334 | orchestrator | ok: [testbed-node-0] 2026-04-07 
00:53:20.650341 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.650348 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.650355 | orchestrator | 2026-04-07 00:53:20.650362 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.650369 | orchestrator | Tuesday 07 April 2026 00:53:01 +0000 (0:00:00.352) 0:00:10.340 ********* 2026-04-07 00:53:20.650376 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650388 | orchestrator | 2026-04-07 00:53:20.650395 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 00:53:20.650401 | orchestrator | Tuesday 07 April 2026 00:53:01 +0000 (0:00:00.119) 0:00:10.460 ********* 2026-04-07 00:53:20.650408 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650416 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.650422 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.650429 | orchestrator | 2026-04-07 00:53:20.650436 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-04-07 00:53:20.650444 | orchestrator | Tuesday 07 April 2026 00:53:01 +0000 (0:00:00.262) 0:00:10.722 ********* 2026-04-07 00:53:20.650452 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:20.650460 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:20.650466 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:20.650472 | orchestrator | 2026-04-07 00:53:20.650478 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2026-04-07 00:53:20.650484 | orchestrator | Tuesday 07 April 2026 00:53:02 +0000 (0:00:00.649) 0:00:11.371 ********* 2026-04-07 00:53:20.650491 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650497 | orchestrator | 2026-04-07 00:53:20.650504 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2026-04-07 
00:53:20.650510 | orchestrator | Tuesday 07 April 2026 00:53:02 +0000 (0:00:00.123) 0:00:11.495 ********* 2026-04-07 00:53:20.650516 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650523 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.650529 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.650535 | orchestrator | 2026-04-07 00:53:20.650541 | orchestrator | TASK [horizon : Copying over config.json files for services] ******************* 2026-04-07 00:53:20.650548 | orchestrator | Tuesday 07 April 2026 00:53:02 +0000 (0:00:00.306) 0:00:11.801 ********* 2026-04-07 00:53:20.650555 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:53:20.650562 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:53:20.650568 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:53:20.650575 | orchestrator | 2026-04-07 00:53:20.650582 | orchestrator | TASK [horizon : Copying over horizon.conf] ************************************* 2026-04-07 00:53:20.650589 | orchestrator | Tuesday 07 April 2026 00:53:04 +0000 (0:00:01.687) 0:00:13.489 ********* 2026-04-07 00:53:20.650596 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/horizon.conf.j2) 2026-04-07 00:53:20.650603 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/horizon.conf.j2) 2026-04-07 00:53:20.650609 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/horizon.conf.j2) 2026-04-07 00:53:20.650616 | orchestrator | 2026-04-07 00:53:20.650623 | orchestrator | TASK [horizon : Copying over kolla-settings.py] ******************************** 2026-04-07 00:53:20.650629 | orchestrator | Tuesday 07 April 2026 00:53:06 +0000 (0:00:01.965) 0:00:15.455 ********* 2026-04-07 00:53:20.650635 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2) 2026-04-07 00:53:20.650664 | orchestrator | changed: [testbed-node-0] => 
(item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2) 2026-04-07 00:53:20.650671 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2) 2026-04-07 00:53:20.650678 | orchestrator | 2026-04-07 00:53:20.650685 | orchestrator | TASK [horizon : Copying over custom-settings.py] ******************************* 2026-04-07 00:53:20.650691 | orchestrator | Tuesday 07 April 2026 00:53:09 +0000 (0:00:02.896) 0:00:18.351 ********* 2026-04-07 00:53:20.650698 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2) 2026-04-07 00:53:20.650705 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2) 2026-04-07 00:53:20.650712 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2) 2026-04-07 00:53:20.650727 | orchestrator | 2026-04-07 00:53:20.650735 | orchestrator | TASK [horizon : Copying over existing policy file] ***************************** 2026-04-07 00:53:20.650742 | orchestrator | Tuesday 07 April 2026 00:53:10 +0000 (0:00:01.598) 0:00:19.950 ********* 2026-04-07 00:53:20.650749 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650755 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.650762 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.650769 | orchestrator | 2026-04-07 00:53:20.650780 | orchestrator | TASK [horizon : Copying over custom themes] ************************************ 2026-04-07 00:53:20.650787 | orchestrator | Tuesday 07 April 2026 00:53:10 +0000 (0:00:00.242) 0:00:20.192 ********* 2026-04-07 00:53:20.650794 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650800 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.650807 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.650813 | orchestrator | 2026-04-07 00:53:20.650820 | orchestrator 
| TASK [horizon : include_tasks] ************************************************* 2026-04-07 00:53:20.650827 | orchestrator | Tuesday 07 April 2026 00:53:11 +0000 (0:00:00.287) 0:00:20.480 ********* 2026-04-07 00:53:20.650833 | orchestrator | included: /ansible/roles/horizon/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:53:20.650840 | orchestrator | 2026-04-07 00:53:20.650847 | orchestrator | TASK [service-cert-copy : horizon | Copying over extra CA certificates] ******** 2026-04-07 00:53:20.650854 | orchestrator | Tuesday 07 April 2026 00:53:11 +0000 (0:00:00.637) 0:00:21.118 ********* 2026-04-07 00:53:20.650863 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': 
{'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.650884 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 
'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.650901 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.650912 | orchestrator | 2026-04-07 00:53:20.650919 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS certificate] *** 2026-04-07 00:53:20.650926 | orchestrator | Tuesday 07 April 2026 00:53:13 +0000 (0:00:01.399) 0:00:22.518 ********* 2026-04-07 00:53:20.650936 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 
'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.650944 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.650956 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 
'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.650967 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.650977 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': 
['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.650985 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.650991 | orchestrator | 2026-04-07 00:53:20.650998 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS key] ***** 2026-04-07 00:53:20.651004 | orchestrator | Tuesday 07 April 2026 00:53:13 +0000 (0:00:00.648) 0:00:23.167 ********* 2026-04-07 00:53:20.651019 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.651033 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.651041 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 
'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.651048 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.651068 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.651076 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.651082 | orchestrator | 2026-04-07 00:53:20.651089 | orchestrator | TASK [service-check-containers : horizon | Check containers] ******************* 2026-04-07 00:53:20.651096 | orchestrator | Tuesday 07 April 2026 00:53:14 +0000 (0:00:00.942) 0:00:24.110 ********* 2026-04-07 00:53:20.651105 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { 
path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.651122 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ 
}'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.651136 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-07 00:53:20.651148 | orchestrator | 2026-04-07 00:53:20.651155 | orchestrator | TASK [service-check-containers : horizon | Notify handlers to restart containers] *** 2026-04-07 00:53:20.651162 | orchestrator | Tuesday 07 April 2026 00:53:16 +0000 (0:00:01.289) 0:00:25.400 ********* 2026-04-07 00:53:20.651168 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 00:53:20.651175 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:53:20.651181 | orchestrator | } 2026-04-07 00:53:20.651188 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 00:53:20.651195 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:53:20.651201 | orchestrator | } 2026-04-07 00:53:20.651208 | 
orchestrator | changed: [testbed-node-2] => { 2026-04-07 00:53:20.651215 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 00:53:20.651221 | orchestrator | } 2026-04-07 00:53:20.651228 | orchestrator | 2026-04-07 00:53:20.651241 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:53:20.651248 | orchestrator | Tuesday 07 April 2026 00:53:16 +0000 (0:00:00.343) 0:00:25.744 ********* 2026-04-07 00:53:20.651256 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.651268 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.651284 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend 
acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.651292 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.651299 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-07 00:53:20.651311 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:20.651318 | orchestrator | 2026-04-07 00:53:20.651325 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-04-07 00:53:20.651332 | orchestrator | Tuesday 07 April 2026 00:53:17 +0000 (0:00:01.068) 0:00:26.812 ********* 2026-04-07 00:53:20.651339 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:20.651346 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:20.651352 | orchestrator | skipping: 
[testbed-node-2] 2026-04-07 00:53:20.651359 | orchestrator | 2026-04-07 00:53:20.651369 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-04-07 00:53:20.651376 | orchestrator | Tuesday 07 April 2026 00:53:17 +0000 (0:00:00.254) 0:00:27.067 ********* 2026-04-07 00:53:20.651384 | orchestrator | included: /ansible/roles/horizon/tasks/bootstrap.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:53:20.651391 | orchestrator | 2026-04-07 00:53:20.651397 | orchestrator | TASK [horizon : Creating Horizon database] ************************************* 2026-04-07 00:53:20.651404 | orchestrator | Tuesday 07 April 2026 00:53:18 +0000 (0:00:00.722) 0:00:27.790 ********* 2026-04-07 00:53:20.651411 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:53:20.651418 | orchestrator | 2026-04-07 00:53:20.651425 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:53:20.651433 | orchestrator | testbed-node-0 : ok=34  changed=8  unreachable=0 failed=1  skipped=26  rescued=0 ignored=0 2026-04-07 00:53:20.651444 | orchestrator | testbed-node-1 : ok=34  changed=8  unreachable=0 failed=0 skipped=16  rescued=0 ignored=0 2026-04-07 00:53:20.651451 | orchestrator | testbed-node-2 : ok=34  changed=8  unreachable=0 failed=0 skipped=16  rescued=0 ignored=0 2026-04-07 00:53:20.651457 | orchestrator | 2026-04-07 00:53:20.651464 | orchestrator | 2026-04-07 00:53:20.651471 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:53:20.651478 | orchestrator | Tuesday 07 April 2026 00:53:19 +0000 (0:00:00.791) 0:00:28.581 ********* 2026-04-07 00:53:20.651484 | orchestrator | =============================================================================== 2026-04-07 00:53:20.651490 | orchestrator | horizon : Copying over 
kolla-settings.py -------------------------------- 2.90s 2026-04-07 00:53:20.651496 | orchestrator | horizon : Copying over horizon.conf ------------------------------------- 1.97s 2026-04-07 00:53:20.651503 | orchestrator | horizon : Copying over config.json files for services ------------------- 1.69s 2026-04-07 00:53:20.651510 | orchestrator | horizon : Copying over custom-settings.py ------------------------------- 1.60s 2026-04-07 00:53:20.651517 | orchestrator | horizon : Ensuring config directories exist ----------------------------- 1.59s 2026-04-07 00:53:20.651523 | orchestrator | service-cert-copy : horizon | Copying over extra CA certificates -------- 1.40s 2026-04-07 00:53:20.651529 | orchestrator | service-check-containers : horizon | Check containers ------------------- 1.29s 2026-04-07 00:53:20.651540 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.07s 2026-04-07 00:53:20.651547 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS key ----- 0.94s 2026-04-07 00:53:20.651553 | orchestrator | horizon : Creating Horizon database ------------------------------------- 0.79s 2026-04-07 00:53:20.651560 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.72s 2026-04-07 00:53:20.651566 | orchestrator | horizon : Update policy file name --------------------------------------- 0.65s 2026-04-07 00:53:20.651573 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS certificate --- 0.65s 2026-04-07 00:53:20.651580 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.65s 2026-04-07 00:53:20.651587 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.64s 2026-04-07 00:53:20.651593 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.53s 2026-04-07 00:53:20.651599 | orchestrator | horizon : Update custom policy 
file name -------------------------------- 0.48s 2026-04-07 00:53:20.651606 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.44s 2026-04-07 00:53:20.651612 | orchestrator | horizon : Update policy file name --------------------------------------- 0.41s 2026-04-07 00:53:20.651619 | orchestrator | horizon : Update policy file name --------------------------------------- 0.41s 2026-04-07 00:53:20.651625 | orchestrator | 2026-04-07 00:53:20 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:20.651632 | orchestrator | 2026-04-07 00:53:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:23.682443 | orchestrator | 2026-04-07 00:53:23 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:23.685261 | orchestrator | 2026-04-07 00:53:23 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:23.685347 | orchestrator | 2026-04-07 00:53:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:26.725415 | orchestrator | 2026-04-07 00:53:26 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:26.727485 | orchestrator | 2026-04-07 00:53:26 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:26.727555 | orchestrator | 2026-04-07 00:53:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:29.765774 | orchestrator | 2026-04-07 00:53:29 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:29.767081 | orchestrator | 2026-04-07 00:53:29 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:29.767123 | orchestrator | 2026-04-07 00:53:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:32.799506 | orchestrator | 2026-04-07 00:53:32 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state STARTED 2026-04-07 00:53:32.799713 | orchestrator 
| 2026-04-07 00:53:32 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:32.801025 | orchestrator | 2026-04-07 00:53:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:35.829554 | orchestrator | 2026-04-07 00:53:35 | INFO  | Task ba6d3bde-a27b-4d05-adbc-7e043f78c4d8 is in state SUCCESS 2026-04-07 00:53:35.830664 | orchestrator | 2026-04-07 00:53:35.830766 | orchestrator | 2026-04-07 00:53:35.830782 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:53:35.830791 | orchestrator | 2026-04-07 00:53:35.830798 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 00:53:35.830913 | orchestrator | Tuesday 07 April 2026 00:52:51 +0000 (0:00:00.272) 0:00:00.272 ********* 2026-04-07 00:53:35.830931 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:53:35.830941 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:53:35.830983 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:53:35.830993 | orchestrator | 2026-04-07 00:53:35.831018 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:53:35.831306 | orchestrator | Tuesday 07 April 2026 00:52:51 +0000 (0:00:00.245) 0:00:00.517 ********* 2026-04-07 00:53:35.831323 | orchestrator | ok: [testbed-node-0] => (item=enable_keystone_True) 2026-04-07 00:53:35.831335 | orchestrator | ok: [testbed-node-1] => (item=enable_keystone_True) 2026-04-07 00:53:35.831345 | orchestrator | ok: [testbed-node-2] => (item=enable_keystone_True) 2026-04-07 00:53:35.831357 | orchestrator | 2026-04-07 00:53:35.831364 | orchestrator | PLAY [Apply role keystone] ***************************************************** 2026-04-07 00:53:35.831371 | orchestrator | 2026-04-07 00:53:35.831377 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-04-07 00:53:35.831387 | orchestrator | Tuesday 
07 April 2026 00:52:51 +0000 (0:00:00.271) 0:00:00.789 ********* 2026-04-07 00:53:35.831397 | orchestrator | included: /ansible/roles/keystone/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:53:35.831408 | orchestrator | 2026-04-07 00:53:35.831418 | orchestrator | TASK [keystone : Ensuring config directories exist] **************************** 2026-04-07 00:53:35.831429 | orchestrator | Tuesday 07 April 2026 00:52:52 +0000 (0:00:00.515) 0:00:01.305 ********* 2026-04-07 00:53:35.831446 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:53:35.831461 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:53:35.831503 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:53:35.831532 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 
'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-07 00:53:35.831542 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-07 00:53:35.831552 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-07 00:53:35.831561 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 
'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-07 00:53:35.831571 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-07 00:53:35.831589 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-07 00:53:35.831606 | orchestrator | 2026-04-07 00:53:35.831616 | orchestrator | TASK [keystone : Check if policies shall be overwritten] 
*********************** 2026-04-07 00:53:35.831847 | orchestrator | Tuesday 07 April 2026 00:52:54 +0000 (0:00:02.291) 0:00:03.596 ********* 2026-04-07 00:53:35.831878 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:35.831892 | orchestrator | 2026-04-07 00:53:35.831902 | orchestrator | TASK [keystone : Set keystone policy file] ************************************* 2026-04-07 00:53:35.831921 | orchestrator | Tuesday 07 April 2026 00:52:54 +0000 (0:00:00.087) 0:00:03.684 ********* 2026-04-07 00:53:35.831932 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:35.831942 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:35.831952 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:35.831962 | orchestrator | 2026-04-07 00:53:35.831973 | orchestrator | TASK [keystone : Check if Keystone domain-specific config is supplied] ********* 2026-04-07 00:53:35.831983 | orchestrator | Tuesday 07 April 2026 00:52:55 +0000 (0:00:00.223) 0:00:03.907 ********* 2026-04-07 00:53:35.831995 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-07 00:53:35.832002 | orchestrator | 2026-04-07 00:53:35.832009 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-04-07 00:53:35.832015 | orchestrator | Tuesday 07 April 2026 00:52:55 +0000 (0:00:00.836) 0:00:04.744 ********* 2026-04-07 00:53:35.832022 | orchestrator | included: /ansible/roles/keystone/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:53:35.832028 | orchestrator | 2026-04-07 00:53:35.832035 | orchestrator | TASK [service-cert-copy : keystone | Copying over extra CA certificates] ******* 2026-04-07 00:53:35.832041 | orchestrator | Tuesday 07 April 2026 00:52:56 +0000 (0:00:00.598) 0:00:05.342 ********* 2026-04-07 00:53:35.832049 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:53:35.832058 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 
'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:53:35.832084 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-07 00:53:35.832096 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-07 00:53:35.832103 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 
'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-07 00:53:35.832110 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-07 00:53:35.832116 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-07 00:53:35.832135 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-07 00:53:35.832141 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-07 00:53:35.832153 | orchestrator | 2026-04-07 00:53:35.832159 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS certificate] *** 2026-04-07 00:53:35.832168 | orchestrator | Tuesday 07 April 2026 00:52:59 +0000 (0:00:03.343) 0:00:08.685 ********* 2026-04-07 00:53:35.832183 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.832195 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:53:35.832206 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:53:35.832216 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:35.832227 | orchestrator | 
skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.832254 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:53:35.832265 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:53:35.832275 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:35.832286 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.832298 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:53:35.832315 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:53:35.832326 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:35.832336 | orchestrator | 2026-04-07 00:53:35.832347 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS key] **** 2026-04-07 00:53:35.832358 | orchestrator | Tuesday 07 April 2026 00:53:00 +0000 (0:00:00.665) 0:00:09.351 ********* 2026-04-07 00:53:35.832412 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 
'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.832427 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:53:35.832438 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:53:35.832449 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:35.832460 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 
'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.832485 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:53:35.832498 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:53:35.832509 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:35.832535 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.832548 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.832558 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.832582 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:53:35.832593 | orchestrator |
2026-04-07 00:53:35.832605 | orchestrator | TASK [keystone : Copying over config.json files for services] ******************
2026-04-07 00:53:35.832615 | orchestrator | Tuesday 07 April 2026 00:53:01 +0000 (0:00:00.950) 0:00:10.302 *********
2026-04-07 00:53:35.832625 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.832648 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.832661 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.832673 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.832713 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.832725 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.832743 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.832757 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.832765 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.832771 | orchestrator |
2026-04-07 00:53:35.832778 | orchestrator | TASK [keystone : Copying over keystone.conf] ***********************************
2026-04-07 00:53:35.832784 | orchestrator | Tuesday 07 April 2026 00:53:04 +0000 (0:00:03.213) 0:00:13.516 *********
2026-04-07 00:53:35.832796 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.832803 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.832816 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.832827 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.832834 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.832846 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.832856 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.832866 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.832886 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.832901 | orchestrator |
2026-04-07 00:53:35.832919 | orchestrator | TASK [keystone : Copying keystone-startup script for keystone] *****************
2026-04-07 00:53:35.832928 | orchestrator | Tuesday 07 April 2026 00:53:10 +0000 (0:00:05.956) 0:00:19.473 *********
2026-04-07 00:53:35.832939 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:53:35.832949 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:53:35.832958 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:53:35.832967 | orchestrator |
2026-04-07 00:53:35.832976 | orchestrator | TASK [keystone : Create Keystone domain-specific config directory] *************
2026-04-07 00:53:35.832985 | orchestrator | Tuesday 07 April 2026 00:53:12 +0000 (0:00:01.446) 0:00:20.919 *********
2026-04-07 00:53:35.832995 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:53:35.833005 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:53:35.833015 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:53:35.833032 | orchestrator |
2026-04-07 00:53:35.833043 | orchestrator | TASK [keystone : Get file list in custom domains folder] ***********************
2026-04-07 00:53:35.833054 | orchestrator | Tuesday 07 April 2026 00:53:12 +0000 (0:00:00.619) 0:00:21.538 *********
2026-04-07 00:53:35.833064 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:53:35.833075 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:53:35.833083 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:53:35.833090 | orchestrator |
2026-04-07 00:53:35.833096 | orchestrator | TASK [keystone : Copying Keystone Domain specific settings] ********************
2026-04-07 00:53:35.833102 | orchestrator | Tuesday 07 April 2026 00:53:13 +0000 (0:00:00.407) 0:00:21.946 *********
2026-04-07 00:53:35.833108 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:53:35.833115 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:53:35.833121 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:53:35.833127 | orchestrator |
2026-04-07 00:53:35.833134 | orchestrator | TASK [keystone : Copying over existing policy file] ****************************
2026-04-07 00:53:35.833140 | orchestrator | Tuesday 07 April 2026 00:53:13 +0000 (0:00:00.289) 0:00:22.236 *********
2026-04-07 00:53:35.833147 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.833154 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.833161 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.833167 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:53:35.833184 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.833197 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.833204 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.833211 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:53:35.833218 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.833225 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.833239 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.833250 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:53:35.833257 | orchestrator |
2026-04-07 00:53:35.833264 | orchestrator | TASK [keystone : include_tasks] ************************************************
2026-04-07 00:53:35.833270 | orchestrator | Tuesday 07 April 2026 00:53:13 +0000 (0:00:00.251) 0:00:22.819 *********
2026-04-07 00:53:35.833276 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:53:35.833283 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:53:35.833289 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:53:35.833295 | orchestrator |
2026-04-07 00:53:35.833301 | orchestrator | TASK [keystone : Copying over wsgi-keystone.conf] ******************************
2026-04-07 00:53:35.833308 | orchestrator | Tuesday 07 April 2026 00:53:14 +0000 (0:00:00.251) 0:00:23.071 *********
2026-04-07 00:53:35.833314 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2)
2026-04-07 00:53:35.833322 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2)
2026-04-07 00:53:35.833328 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2)
2026-04-07 00:53:35.833334 | orchestrator |
2026-04-07 00:53:35.833341 | orchestrator | TASK [keystone : Checking whether keystone-paste.ini file exists] **************
2026-04-07 00:53:35.833347 | orchestrator | Tuesday 07 April 2026 00:53:15 +0000 (0:00:01.599) 0:00:24.671 *********
2026-04-07 00:53:35.833353 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-04-07 00:53:35.833360 | orchestrator |
2026-04-07 00:53:35.833366 | orchestrator | TASK [keystone : Copying over keystone-paste.ini] ******************************
2026-04-07 00:53:35.833372 | orchestrator | Tuesday 07 April 2026 00:53:16 +0000 (0:00:01.081) 0:00:25.752 *********
2026-04-07 00:53:35.833378 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:53:35.833385 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:53:35.833391 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:53:35.833397 | orchestrator |
2026-04-07 00:53:35.833404 | orchestrator | TASK [keystone : Generate the required cron jobs for the node] *****************
2026-04-07 00:53:35.833410 | orchestrator | Tuesday 07 April 2026 00:53:17 +0000 (0:00:00.466) 0:00:26.219 *********
2026-04-07 00:53:35.833416 | orchestrator | ok: [testbed-node-1 -> localhost]
2026-04-07 00:53:35.833423 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-04-07 00:53:35.833429 | orchestrator | ok: [testbed-node-2 -> localhost]
2026-04-07 00:53:35.833435 | orchestrator |
2026-04-07 00:53:35.833442 | orchestrator | TASK [keystone : Set fact with the generated cron jobs for building the crontab later] ***
2026-04-07 00:53:35.833449 | orchestrator | Tuesday 07 April 2026 00:53:18 +0000 (0:00:01.138) 0:00:27.357 *********
2026-04-07 00:53:35.833455 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:53:35.833574 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:53:35.833586 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:53:35.833597 | orchestrator |
2026-04-07 00:53:35.833606 | orchestrator | TASK [keystone : Copying files for keystone-fernet] ****************************
2026-04-07 00:53:35.833616 | orchestrator | Tuesday 07 April 2026 00:53:18 +0000 (0:00:00.255) 0:00:27.613 *********
2026-04-07 00:53:35.833625 | orchestrator | changed: [testbed-node-0] => (item={'src': 'crontab.j2', 'dest': 'crontab'})
2026-04-07 00:53:35.833635 | orchestrator | changed: [testbed-node-1] => (item={'src': 'crontab.j2', 'dest': 'crontab'})
2026-04-07 00:53:35.833645 | orchestrator | changed: [testbed-node-2] => (item={'src': 'crontab.j2', 'dest': 'crontab'})
2026-04-07 00:53:35.833656 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'})
2026-04-07 00:53:35.833667 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'})
2026-04-07 00:53:35.833746 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'})
2026-04-07 00:53:35.833760 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'})
2026-04-07 00:53:35.833771 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'})
2026-04-07 00:53:35.833782 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'})
2026-04-07 00:53:35.833793 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'})
2026-04-07 00:53:35.833803 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'})
2026-04-07 00:53:35.833814 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'})
2026-04-07 00:53:35.833825 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'})
2026-04-07 00:53:35.833837 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'})
2026-04-07 00:53:35.833857 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'})
2026-04-07 00:53:35.833867 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa', 'dest': 'id_rsa'})
2026-04-07 00:53:35.833877 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa', 'dest': 'id_rsa'})
2026-04-07 00:53:35.833886 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa', 'dest': 'id_rsa'})
2026-04-07 00:53:35.833897 | orchestrator | changed: [testbed-node-1] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'})
2026-04-07 00:53:35.833915 | orchestrator | changed: [testbed-node-0] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'})
2026-04-07 00:53:35.833926 | orchestrator | changed: [testbed-node-2] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'})
2026-04-07 00:53:35.833936 | orchestrator |
2026-04-07 00:53:35.833946 | orchestrator | TASK [keystone : Copying files for keystone-ssh] *******************************
2026-04-07 00:53:35.833957 | orchestrator | Tuesday 07 April 2026 00:53:27 +0000 (0:00:08.932) 0:00:36.545 *********
2026-04-07 00:53:35.833967 | orchestrator | changed: [testbed-node-0] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'})
2026-04-07 00:53:35.833978 | orchestrator | changed: [testbed-node-1] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'})
2026-04-07 00:53:35.833989 | orchestrator | changed: [testbed-node-2] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'})
2026-04-07 00:53:35.834000 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'})
2026-04-07 00:53:35.834010 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'})
2026-04-07 00:53:35.834093 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'})
2026-04-07 00:53:35.834103 | orchestrator |
2026-04-07 00:53:35.834114 | orchestrator | TASK [service-check-containers : keystone | Check containers] ******************
2026-04-07 00:53:35.834123 | orchestrator | Tuesday 07 April 2026 00:53:30 +0000 (0:00:02.361) 0:00:38.907 *********
2026-04-07 00:53:35.834131 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.834147 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.834170 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})
2026-04-07 00:53:35.834178 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.834185 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.834193 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})
2026-04-07 00:53:35.834205 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.834213 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.834227 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})
2026-04-07 00:53:35.834236 | orchestrator |
2026-04-07 00:53:35.834246 | orchestrator | TASK [service-check-containers : keystone | Notify handlers to restart containers] ***
2026-04-07 00:53:35.834266 | orchestrator | Tuesday 07 April 2026 00:53:32 +0000 (0:00:02.095) 0:00:41.003 *********
2026-04-07 00:53:35.834278 | orchestrator | changed: [testbed-node-0] => {
2026-04-07 00:53:35.834288 | orchestrator |     "msg": "Notifying handlers"
2026-04-07 00:53:35.834299 | orchestrator | }
2026-04-07 00:53:35.834309 | orchestrator | changed: [testbed-node-1] => {
2026-04-07 00:53:35.834318 | orchestrator |     "msg": "Notifying handlers"
2026-04-07 00:53:35.834328 | orchestrator | }
2026-04-07 00:53:35.834338 | orchestrator | changed: [testbed-node-2] => {
2026-04-07 00:53:35.834348 | orchestrator |     "msg": "Notifying handlers"
2026-04-07 00:53:35.834358 | orchestrator | }
2026-04-07 00:53:35.834368 | orchestrator |
2026-04-07 00:53:35.834379 |
orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 00:53:35.834389 | orchestrator | Tuesday 07 April 2026 00:53:32 +0000 (0:00:00.295) 0:00:41.298 ********* 2026-04-07 00:53:35.834401 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.834554 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  
2026-04-07 00:53:35.834567 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:53:35.834574 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:35.834589 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.834601 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 
'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:53:35.834608 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:53:35.834621 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:35.834628 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 
'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-07 00:53:35.834635 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-07 00:53:35.834642 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-07 00:53:35.834649 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:35.834655 | orchestrator | 2026-04-07 00:53:35.834661 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-04-07 00:53:35.834668 | orchestrator | Tuesday 
07 April 2026 00:53:33 +0000 (0:00:00.831) 0:00:42.130 ********* 2026-04-07 00:53:35.834677 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:53:35.834710 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:53:35.834721 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:53:35.834731 | orchestrator | 2026-04-07 00:53:35.834740 | orchestrator | TASK [keystone : Creating keystone database] *********************************** 2026-04-07 00:53:35.834750 | orchestrator | Tuesday 07 April 2026 00:53:33 +0000 (0:00:00.236) 0:00:42.366 ********* 2026-04-07 00:53:35.834766 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:53:35.834777 | orchestrator | 2026-04-07 00:53:35.834787 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:53:35.834799 | orchestrator | testbed-node-0 : ok=18  changed=10  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0 2026-04-07 00:53:35.834812 | orchestrator | testbed-node-1 : ok=16  changed=10  unreachable=0 failed=0 skipped=11  rescued=0 ignored=0 2026-04-07 00:53:35.834832 | orchestrator | testbed-node-2 : ok=16  changed=10  unreachable=0 failed=0 skipped=11  rescued=0 ignored=0 2026-04-07 00:53:35.834839 | orchestrator | 2026-04-07 00:53:35.834845 | orchestrator | 2026-04-07 00:53:35.834852 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:53:35.834858 | orchestrator | Tuesday 07 April 2026 00:53:34 +0000 (0:00:00.774) 0:00:43.141 ********* 2026-04-07 00:53:35.834864 | orchestrator | =============================================================================== 2026-04-07 00:53:35.834871 | orchestrator | keystone : Copying files for keystone-fernet ---------------------------- 8.93s 2026-04-07 00:53:35.834877 | orchestrator | keystone : Copying over keystone.conf ----------------------------------- 
5.96s 2026-04-07 00:53:35.834883 | orchestrator | service-cert-copy : keystone | Copying over extra CA certificates ------- 3.34s 2026-04-07 00:53:35.834889 | orchestrator | keystone : Copying over config.json files for services ------------------ 3.21s 2026-04-07 00:53:35.834895 | orchestrator | keystone : Copying files for keystone-ssh ------------------------------- 2.36s 2026-04-07 00:53:35.834902 | orchestrator | keystone : Ensuring config directories exist ---------------------------- 2.29s 2026-04-07 00:53:35.834908 | orchestrator | service-check-containers : keystone | Check containers ------------------ 2.10s 2026-04-07 00:53:35.834914 | orchestrator | keystone : Copying over wsgi-keystone.conf ------------------------------ 1.60s 2026-04-07 00:53:35.834920 | orchestrator | keystone : Copying keystone-startup script for keystone ----------------- 1.45s 2026-04-07 00:53:35.834927 | orchestrator | keystone : Generate the required cron jobs for the node ----------------- 1.14s 2026-04-07 00:53:35.834933 | orchestrator | keystone : Checking whether keystone-paste.ini file exists -------------- 1.08s 2026-04-07 00:53:35.834939 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS key ---- 0.95s 2026-04-07 00:53:35.834945 | orchestrator | keystone : Check if Keystone domain-specific config is supplied --------- 0.84s 2026-04-07 00:53:35.834951 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.83s 2026-04-07 00:53:35.834958 | orchestrator | keystone : Creating keystone database ----------------------------------- 0.77s 2026-04-07 00:53:35.834964 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS certificate --- 0.67s 2026-04-07 00:53:35.834970 | orchestrator | keystone : Create Keystone domain-specific config directory ------------- 0.62s 2026-04-07 00:53:35.834977 | orchestrator | keystone : include_tasks ------------------------------------------------ 0.60s 
2026-04-07 00:53:35.834983 | orchestrator | keystone : Copying over existing policy file ---------------------------- 0.58s 2026-04-07 00:53:35.834989 | orchestrator | keystone : include_tasks ------------------------------------------------ 0.52s 2026-04-07 00:53:35.834995 | orchestrator | 2026-04-07 00:53:35 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:53:35.835002 | orchestrator | 2026-04-07 00:53:35 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:53:35.835008 | orchestrator | 2026-04-07 00:53:35 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:53:35.836077 | orchestrator | 2026-04-07 00:53:35 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:53:35.837032 | orchestrator | 2026-04-07 00:53:35 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:35.837103 | orchestrator | 2026-04-07 00:53:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:38.870984 | orchestrator | 2026-04-07 00:53:38 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:53:38.871055 | orchestrator | 2026-04-07 00:53:38 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:53:38.871252 | orchestrator | 2026-04-07 00:53:38 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:53:38.872019 | orchestrator | 2026-04-07 00:53:38 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:53:38.872596 | orchestrator | 2026-04-07 00:53:38 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:38.872616 | orchestrator | 2026-04-07 00:53:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:41.907803 | orchestrator | 2026-04-07 00:53:41 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:53:41.910892 | 
orchestrator | 2026-04-07 00:53:41 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:53:41.913217 | orchestrator | 2026-04-07 00:53:41 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:53:41.915217 | orchestrator | 2026-04-07 00:53:41 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:53:41.916741 | orchestrator | 2026-04-07 00:53:41 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:41.916814 | orchestrator | 2026-04-07 00:53:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:44.958517 | orchestrator | 2026-04-07 00:53:44 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:53:44.960640 | orchestrator | 2026-04-07 00:53:44 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:53:44.965468 | orchestrator | 2026-04-07 00:53:44 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:53:44.968659 | orchestrator | 2026-04-07 00:53:44 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:53:44.971915 | orchestrator | 2026-04-07 00:53:44 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:44.971964 | orchestrator | 2026-04-07 00:53:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:48.029617 | orchestrator | 2026-04-07 00:53:48 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:53:48.031869 | orchestrator | 2026-04-07 00:53:48 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:53:48.034762 | orchestrator | 2026-04-07 00:53:48 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:53:48.036764 | orchestrator | 2026-04-07 00:53:48 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:53:48.038281 | 
orchestrator | 2026-04-07 00:53:48 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:48.039644 | orchestrator | 2026-04-07 00:53:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:51.081138 | orchestrator | 2026-04-07 00:53:51 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:53:51.084183 | orchestrator | 2026-04-07 00:53:51 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:53:51.086999 | orchestrator | 2026-04-07 00:53:51 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:53:51.089336 | orchestrator | 2026-04-07 00:53:51 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:53:51.091618 | orchestrator | 2026-04-07 00:53:51 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:51.091676 | orchestrator | 2026-04-07 00:53:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:54.142122 | orchestrator | 2026-04-07 00:53:54 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:53:54.144953 | orchestrator | 2026-04-07 00:53:54 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:53:54.147114 | orchestrator | 2026-04-07 00:53:54 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:53:54.151663 | orchestrator | 2026-04-07 00:53:54 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:53:54.153839 | orchestrator | 2026-04-07 00:53:54 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:54.154118 | orchestrator | 2026-04-07 00:53:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:53:57.195394 | orchestrator | 2026-04-07 00:53:57 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:53:57.196670 | orchestrator | 2026-04-07 
00:53:57 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:53:57.197932 | orchestrator | 2026-04-07 00:53:57 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:53:57.199163 | orchestrator | 2026-04-07 00:53:57 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:53:57.200398 | orchestrator | 2026-04-07 00:53:57 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:53:57.200425 | orchestrator | 2026-04-07 00:53:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:00.243540 | orchestrator | 2026-04-07 00:54:00 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:00.248415 | orchestrator | 2026-04-07 00:54:00 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:00.251275 | orchestrator | 2026-04-07 00:54:00 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:00.251340 | orchestrator | 2026-04-07 00:54:00 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:00.251350 | orchestrator | 2026-04-07 00:54:00 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:00.251359 | orchestrator | 2026-04-07 00:54:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:03.292074 | orchestrator | 2026-04-07 00:54:03 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:03.292672 | orchestrator | 2026-04-07 00:54:03 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:03.294166 | orchestrator | 2026-04-07 00:54:03 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:03.294955 | orchestrator | 2026-04-07 00:54:03 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:03.295777 | orchestrator | 2026-04-07 
00:54:03 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:03.295925 | orchestrator | 2026-04-07 00:54:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:06.339337 | orchestrator | 2026-04-07 00:54:06 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:06.341582 | orchestrator | 2026-04-07 00:54:06 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:06.344104 | orchestrator | 2026-04-07 00:54:06 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:06.345404 | orchestrator | 2026-04-07 00:54:06 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:06.347150 | orchestrator | 2026-04-07 00:54:06 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:06.347189 | orchestrator | 2026-04-07 00:54:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:09.379944 | orchestrator | 2026-04-07 00:54:09 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:09.380643 | orchestrator | 2026-04-07 00:54:09 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:09.381575 | orchestrator | 2026-04-07 00:54:09 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:09.382648 | orchestrator | 2026-04-07 00:54:09 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:09.384877 | orchestrator | 2026-04-07 00:54:09 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:09.384915 | orchestrator | 2026-04-07 00:54:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:12.422108 | orchestrator | 2026-04-07 00:54:12 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:12.422687 | orchestrator | 2026-04-07 00:54:12 | INFO  | Task 
550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:12.423856 | orchestrator | 2026-04-07 00:54:12 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:12.425170 | orchestrator | 2026-04-07 00:54:12 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:12.426332 | orchestrator | 2026-04-07 00:54:12 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:12.426395 | orchestrator | 2026-04-07 00:54:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:15.469871 | orchestrator | 2026-04-07 00:54:15 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:15.471689 | orchestrator | 2026-04-07 00:54:15 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:15.473074 | orchestrator | 2026-04-07 00:54:15 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:15.474482 | orchestrator | 2026-04-07 00:54:15 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:15.475763 | orchestrator | 2026-04-07 00:54:15 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:15.475798 | orchestrator | 2026-04-07 00:54:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:18.528179 | orchestrator | 2026-04-07 00:54:18 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:18.530488 | orchestrator | 2026-04-07 00:54:18 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:18.536561 | orchestrator | 2026-04-07 00:54:18 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:18.538863 | orchestrator | 2026-04-07 00:54:18 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:18.540501 | orchestrator | 2026-04-07 00:54:18 | INFO  | Task 
256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:18.540553 | orchestrator | 2026-04-07 00:54:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:21.601546 | orchestrator | 2026-04-07 00:54:21 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:21.603199 | orchestrator | 2026-04-07 00:54:21 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:21.604953 | orchestrator | 2026-04-07 00:54:21 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:21.606581 | orchestrator | 2026-04-07 00:54:21 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:21.608082 | orchestrator | 2026-04-07 00:54:21 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:21.608134 | orchestrator | 2026-04-07 00:54:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:24.658501 | orchestrator | 2026-04-07 00:54:24 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:24.659975 | orchestrator | 2026-04-07 00:54:24 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED 2026-04-07 00:54:24.661538 | orchestrator | 2026-04-07 00:54:24 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED 2026-04-07 00:54:24.662717 | orchestrator | 2026-04-07 00:54:24 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED 2026-04-07 00:54:24.664119 | orchestrator | 2026-04-07 00:54:24 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED 2026-04-07 00:54:24.664324 | orchestrator | 2026-04-07 00:54:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:54:27.710281 | orchestrator | 2026-04-07 00:54:27 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED 2026-04-07 00:54:27.711949 | orchestrator | 2026-04-07 00:54:27 | INFO  | Task 
550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED
2026-04-07 00:54:27.713625 | orchestrator | 2026-04-07 00:54:27 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED
2026-04-07 00:54:27.715146 | orchestrator | 2026-04-07 00:54:27 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED
2026-04-07 00:54:27.716472 | orchestrator | 2026-04-07 00:54:27 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:27.716511 | orchestrator | 2026-04-07 00:54:27 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:30.762793 | orchestrator | 2026-04-07 00:54:30 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED
2026-04-07 00:54:30.764224 | orchestrator | 2026-04-07 00:54:30 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED
2026-04-07 00:54:30.765581 | orchestrator | 2026-04-07 00:54:30 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state STARTED
2026-04-07 00:54:30.767836 | orchestrator | 2026-04-07 00:54:30 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED
2026-04-07 00:54:30.768875 | orchestrator | 2026-04-07 00:54:30 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:30.768919 | orchestrator | 2026-04-07 00:54:30 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:33.813938 | orchestrator | 2026-04-07 00:54:33 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state STARTED
2026-04-07 00:54:33.816110 | orchestrator | 2026-04-07 00:54:33 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED
2026-04-07 00:54:33.816840 | orchestrator | 2026-04-07 00:54:33 | INFO  | Task 474725fb-15cc-48b6-9db9-b94b86127872 is in state SUCCESS
2026-04-07 00:54:33.818202 | orchestrator | 2026-04-07 00:54:33 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED
2026-04-07 00:54:33.820234 | orchestrator | 2026-04-07 00:54:33 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:33.820275 | orchestrator | 2026-04-07 00:54:33 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:36.865129 | orchestrator | 2026-04-07 00:54:36 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:54:36.868309 | orchestrator | 2026-04-07 00:54:36 | INFO  | Task 7247046c-d94c-4479-b7b1-f55fbefeb884 is in state SUCCESS
2026-04-07 00:54:36.868602 | orchestrator | 2026-04-07 00:54:36 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state STARTED
2026-04-07 00:54:36.869550 | orchestrator | 2026-04-07 00:54:36 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state STARTED
2026-04-07 00:54:36.870080 | orchestrator | 2026-04-07 00:54:36 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:54:36.870841 | orchestrator | 2026-04-07 00:54:36 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:36.870863 | orchestrator | 2026-04-07 00:54:36 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:39.918315 | orchestrator | 2026-04-07 00:54:39 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:54:39.920445 | orchestrator | 2026-04-07 00:54:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:54:39.922698 | orchestrator | 2026-04-07 00:54:39 | INFO  | Task 550d6533-7538-4823-90ab-da9e1dbe6f4d is in state SUCCESS
2026-04-07 00:54:39.923321 | orchestrator |
2026-04-07 00:54:39.923357 | orchestrator |
2026-04-07 00:54:39.923367 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:54:39.923376 | orchestrator |
2026-04-07 00:54:39.923382 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 00:54:39.923390 | orchestrator | Tuesday 07 April 2026 00:53:38 +0000 (0:00:00.266) 0:00:00.266 *********
2026-04-07 00:54:39.923396 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:54:39.923405 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:54:39.923412 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:54:39.923418 | orchestrator |
2026-04-07 00:54:39.923424 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 00:54:39.923432 | orchestrator | Tuesday 07 April 2026 00:53:38 +0000 (0:00:00.256) 0:00:00.523 *********
2026-04-07 00:54:39.923439 | orchestrator | ok: [testbed-node-0] => (item=enable_designate_True)
2026-04-07 00:54:39.923447 | orchestrator | ok: [testbed-node-1] => (item=enable_designate_True)
2026-04-07 00:54:39.923453 | orchestrator | ok: [testbed-node-2] => (item=enable_designate_True)
2026-04-07 00:54:39.923460 | orchestrator |
2026-04-07 00:54:39.923466 | orchestrator | PLAY [Apply role designate] ****************************************************
2026-04-07 00:54:39.923473 | orchestrator |
2026-04-07 00:54:39.923480 | orchestrator | TASK [designate : include_tasks] ***********************************************
2026-04-07 00:54:39.923488 | orchestrator | Tuesday 07 April 2026 00:53:39 +0000 (0:00:00.299) 0:00:00.823 *********
2026-04-07 00:54:39.923495 | orchestrator | included: /ansible/roles/designate/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:54:39.923502 | orchestrator |
2026-04-07 00:54:39.923509 | orchestrator | TASK [service-ks-register : designate | Creating/deleting services] ************
2026-04-07 00:54:39.923516 | orchestrator | Tuesday 07 April 2026 00:53:39 +0000 (0:00:00.557) 0:00:01.381 *********
2026-04-07 00:54:39.923522 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (5 retries left).
2026-04-07 00:54:39.923529 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (4 retries left).
2026-04-07 00:54:39.923535 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (3 retries left).
2026-04-07 00:54:39.923541 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (2 retries left).
2026-04-07 00:54:39.923547 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (1 retries left).
2026-04-07 00:54:39.923556 | orchestrator | failed: [testbed-node-0] (item=designate (dns)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Designate DNS Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9001"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9001"}], "name": "designate", "type": "dns"}, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-07 00:54:39.923586 | orchestrator |
2026-04-07 00:54:39.923593 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:54:39.923600 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.923609 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.923618 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.923625 | orchestrator |
2026-04-07 00:54:39.923631 | orchestrator |
2026-04-07 00:54:39.923638 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:54:39.923644 | orchestrator | Tuesday 07 April 2026 00:54:33 +0000 (0:00:53.772) 0:00:55.154 *********
2026-04-07 00:54:39.923666 | orchestrator | ===============================================================================
2026-04-07 00:54:39.923672 | orchestrator | service-ks-register : designate | Creating/deleting services ----------- 53.77s
2026-04-07 00:54:39.923678 | orchestrator | designate : include_tasks ----------------------------------------------- 0.56s
2026-04-07 00:54:39.923684 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.30s
2026-04-07 00:54:39.923689 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.26s
2026-04-07 00:54:39.923695 | orchestrator |
2026-04-07 00:54:39.923701 | orchestrator |
2026-04-07 00:54:39.923707 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:54:39.923713 | orchestrator |
2026-04-07 00:54:39.923719 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 00:54:39.923724 | orchestrator | Tuesday 07 April 2026 00:53:38 +0000 (0:00:00.541) 0:00:00.541 *********
2026-04-07 00:54:39.923730 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:54:39.923736 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:54:39.923741 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:54:39.923749 | orchestrator |
2026-04-07 00:54:39.923754 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 00:54:39.923760 | orchestrator | Tuesday 07 April 2026 00:53:39 +0000 (0:00:00.330) 0:00:00.872 *********
2026-04-07 00:54:39.923766 | orchestrator | ok: [testbed-node-0] => (item=enable_barbican_True)
2026-04-07 00:54:39.923773 | orchestrator | ok: [testbed-node-1] => (item=enable_barbican_True)
2026-04-07 00:54:39.923779 | orchestrator | ok: [testbed-node-2] => (item=enable_barbican_True)
2026-04-07 00:54:39.923784 | orchestrator |
2026-04-07 00:54:39.923790 | orchestrator | PLAY [Apply role barbican] *****************************************************
2026-04-07 00:54:39.923796 | orchestrator |
2026-04-07 00:54:39.923815 | orchestrator | TASK [barbican : include_tasks] ************************************************
2026-04-07 00:54:39.923874 | orchestrator | Tuesday 07 April 2026 00:53:39 +0000 (0:00:00.246) 0:00:01.119 *********
2026-04-07 00:54:39.923880 | orchestrator | included: /ansible/roles/barbican/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:54:39.923886 | orchestrator |
2026-04-07 00:54:39.923892 | orchestrator | TASK [service-ks-register : barbican | Creating/deleting services] *************
2026-04-07 00:54:39.923899 | orchestrator | Tuesday 07 April 2026 00:53:40 +0000 (0:00:00.560) 0:00:01.679 *********
2026-04-07 00:54:39.923906 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (5 retries left).
2026-04-07 00:54:39.923912 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (4 retries left).
2026-04-07 00:54:39.923919 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (3 retries left).
2026-04-07 00:54:39.923925 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (2 retries left).
2026-04-07 00:54:39.923940 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (1 retries left).
2026-04-07 00:54:39.923946 | orchestrator | failed: [testbed-node-0] (item=barbican (key-manager)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Barbican Key Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9311"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9311"}], "name": "barbican", "type": "key-manager"}, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-07 00:54:39.923953 | orchestrator |
2026-04-07 00:54:39.923958 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:54:39.923962 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.923967 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.923972 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.923977 | orchestrator |
2026-04-07 00:54:39.923981 | orchestrator |
2026-04-07 00:54:39.923985 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:54:39.923990 | orchestrator | Tuesday 07 April 2026 00:54:33 +0000 (0:00:53.680) 0:00:55.360 *********
2026-04-07 00:54:39.923994 | orchestrator | ===============================================================================
2026-04-07 00:54:39.923999 | orchestrator | service-ks-register : barbican | Creating/deleting services ------------ 53.68s
2026-04-07 00:54:39.924003 | orchestrator | barbican : include_tasks ------------------------------------------------ 0.56s
2026-04-07 00:54:39.924008 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.33s
2026-04-07 00:54:39.924012 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.25s
2026-04-07 00:54:39.924017 | orchestrator |
2026-04-07 00:54:39.924021 | orchestrator |
2026-04-07 00:54:39.924026 | orchestrator | PLAY [Download ironic ipa images] **********************************************
2026-04-07 00:54:39.924030 | orchestrator |
2026-04-07 00:54:39.924034 | orchestrator | TASK [Ensure the destination directory exists] *********************************
2026-04-07 00:54:39.924039 | orchestrator | Tuesday 07 April 2026 00:53:38 +0000 (0:00:00.088) 0:00:00.088 *********
2026-04-07 00:54:39.924043 | orchestrator | changed: [localhost]
2026-04-07 00:54:39.924048 | orchestrator |
2026-04-07 00:54:39.924052 | orchestrator | TASK [Download ironic-agent initramfs] *****************************************
2026-04-07 00:54:39.924057 | orchestrator | Tuesday 07 April 2026 00:53:39 +0000 (0:00:00.739) 0:00:00.828 *********
2026-04-07 00:54:39.924061 | orchestrator | changed: [localhost]
2026-04-07 00:54:39.924066 | orchestrator |
2026-04-07 00:54:39.924076 | orchestrator | TASK [Download ironic-agent kernel] ********************************************
2026-04-07 00:54:39.924080 | orchestrator | Tuesday 07 April 2026 00:54:11 +0000 (0:00:31.836) 0:00:32.664 *********
2026-04-07 00:54:39.924085 | orchestrator | FAILED - RETRYING: [localhost]: Download ironic-agent kernel (3 retries left).
2026-04-07 00:54:39.924089 | orchestrator | changed: [localhost]
2026-04-07 00:54:39.924095 | orchestrator |
2026-04-07 00:54:39.924101 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:54:39.924107 | orchestrator |
2026-04-07 00:54:39.924112 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 00:54:39.924118 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:25.891) 0:00:58.555 *********
2026-04-07 00:54:39.924124 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:54:39.924130 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:54:39.924136 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:54:39.924142 | orchestrator |
2026-04-07 00:54:39.924148 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 00:54:39.924159 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:00.354) 0:00:58.910 *********
2026-04-07 00:54:39.924166 | orchestrator | ok: [testbed-node-0] => (item=enable_ironic_False)
2026-04-07 00:54:39.924171 | orchestrator | ok: [testbed-node-2] => (item=enable_ironic_False)
2026-04-07 00:54:39.924175 | orchestrator | ok: [testbed-node-1] => (item=enable_ironic_False)
2026-04-07 00:54:39.924179 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: enable_ironic_True
2026-04-07 00:54:39.924182 | orchestrator |
2026-04-07 00:54:39.924186 | orchestrator | PLAY [Apply role ironic] *******************************************************
2026-04-07 00:54:39.924190 | orchestrator | skipping: no hosts matched
2026-04-07 00:54:39.924194 | orchestrator |
2026-04-07 00:54:39.924203 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:54:39.924207 | orchestrator | localhost : ok=3  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.924211 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.924215 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.924218 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:54:39.924222 | orchestrator |
2026-04-07 00:54:39.924226 | orchestrator |
2026-04-07 00:54:39.924230 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:54:39.924234 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:00.592) 0:00:59.502 *********
2026-04-07 00:54:39.924238 | orchestrator | ===============================================================================
2026-04-07 00:54:39.924241 | orchestrator | Download ironic-agent initramfs ---------------------------------------- 31.84s
2026-04-07 00:54:39.924245 | orchestrator | Download ironic-agent kernel ------------------------------------------- 25.89s
2026-04-07 00:54:39.924249 | orchestrator | Ensure the destination directory exists --------------------------------- 0.74s
2026-04-07 00:54:39.924253 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.59s
2026-04-07 00:54:39.924257 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.35s
2026-04-07 00:54:39.925102 | orchestrator | 2026-04-07 00:54:39 | INFO  | Task 39f5bd76-5fe1-462d-ae31-5c66813d92b5 is in state SUCCESS
2026-04-07 00:54:39.927689 | orchestrator | 2026-04-07 00:54:39 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:54:39.929775 | orchestrator | 2026-04-07 00:54:39 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:39.930273 | orchestrator | 2026-04-07 00:54:39 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:42.991316 | orchestrator | 2026-04-07 00:54:42 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:54:42.993557 | orchestrator | 2026-04-07 00:54:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:54:42.995931 | orchestrator | 2026-04-07 00:54:42 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:54:42.997919 | orchestrator | 2026-04-07 00:54:42 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:42.997973 | orchestrator | 2026-04-07 00:54:42 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:46.042677 | orchestrator | 2026-04-07 00:54:46 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:54:46.043471 | orchestrator | 2026-04-07 00:54:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:54:46.043998 | orchestrator | 2026-04-07 00:54:46 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:54:46.044810 | orchestrator | 2026-04-07 00:54:46 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:46.044935 | orchestrator | 2026-04-07 00:54:46 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:49.101179 | orchestrator | 2026-04-07 00:54:49 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:54:49.105152 | orchestrator | 2026-04-07 00:54:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:54:49.108174 | orchestrator | 2026-04-07 00:54:49 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:54:49.110284 | orchestrator | 2026-04-07 00:54:49 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:49.110342 | orchestrator | 2026-04-07 00:54:49 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:52.169189 | orchestrator | 2026-04-07 00:54:52 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:54:52.170616 | orchestrator | 2026-04-07 00:54:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:54:52.172169 | orchestrator | 2026-04-07 00:54:52 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:54:52.172703 | orchestrator | 2026-04-07 00:54:52 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:52.173028 | orchestrator | 2026-04-07 00:54:52 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:55.223848 | orchestrator | 2026-04-07 00:54:55 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:54:55.225096 | orchestrator | 2026-04-07 00:54:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:54:55.226634 | orchestrator | 2026-04-07 00:54:55 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:54:55.228383 | orchestrator | 2026-04-07 00:54:55 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:55.228426 | orchestrator | 2026-04-07 00:54:55 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:54:58.274297 | orchestrator | 2026-04-07 00:54:58 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:54:58.275705 | orchestrator | 2026-04-07 00:54:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:54:58.278985 | orchestrator | 2026-04-07 00:54:58 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:54:58.280449 | orchestrator | 2026-04-07 00:54:58 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:54:58.280497 | orchestrator | 2026-04-07 00:54:58 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:01.338260 | orchestrator | 2026-04-07 00:55:01 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:01.339783 | orchestrator | 2026-04-07 00:55:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:01.341344 | orchestrator | 2026-04-07 00:55:01 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:01.343809 | orchestrator | 2026-04-07 00:55:01 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:55:01.343942 | orchestrator | 2026-04-07 00:55:01 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:04.391637 | orchestrator | 2026-04-07 00:55:04 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:04.393768 | orchestrator | 2026-04-07 00:55:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:04.395790 | orchestrator | 2026-04-07 00:55:04 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:04.398249 | orchestrator | 2026-04-07 00:55:04 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:55:04.398298 | orchestrator | 2026-04-07 00:55:04 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:07.450363 | orchestrator | 2026-04-07 00:55:07 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:07.451387 | orchestrator | 2026-04-07 00:55:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:07.452168 | orchestrator | 2026-04-07 00:55:07 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:07.453433 | orchestrator | 2026-04-07 00:55:07 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:55:07.453468 | orchestrator | 2026-04-07 00:55:07 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:10.505056 | orchestrator | 2026-04-07 00:55:10 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:10.507840 | orchestrator | 2026-04-07 00:55:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:10.510257 | orchestrator | 2026-04-07 00:55:10 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:10.512572 | orchestrator | 2026-04-07 00:55:10 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:55:10.512660 | orchestrator | 2026-04-07 00:55:10 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:13.562606 | orchestrator | 2026-04-07 00:55:13 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:13.565325 | orchestrator | 2026-04-07 00:55:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:13.566987 | orchestrator | 2026-04-07 00:55:13 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:13.568700 | orchestrator | 2026-04-07 00:55:13 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:55:13.568739 | orchestrator | 2026-04-07 00:55:13 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:16.613068 | orchestrator | 2026-04-07 00:55:16 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:16.615128 | orchestrator | 2026-04-07 00:55:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:16.619550 | orchestrator | 2026-04-07 00:55:16 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:16.621994 | orchestrator | 2026-04-07 00:55:16 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:55:16.622095 | orchestrator | 2026-04-07 00:55:16 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:19.678711 | orchestrator | 2026-04-07 00:55:19 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:19.680685 | orchestrator | 2026-04-07 00:55:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:19.682067 | orchestrator | 2026-04-07 00:55:19 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:19.683713 | orchestrator | 2026-04-07 00:55:19 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state STARTED
2026-04-07 00:55:19.683884 | orchestrator | 2026-04-07 00:55:19 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:22.738591 | orchestrator | 2026-04-07 00:55:22 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:22.740066 | orchestrator | 2026-04-07 00:55:22 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:22.741759 | orchestrator | 2026-04-07 00:55:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:22.743715 | orchestrator | 2026-04-07 00:55:22 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:22.749849 | orchestrator | 2026-04-07 00:55:22 | INFO  | Task 256c75b3-216d-4ba2-becc-497bee457b4d is in state SUCCESS
2026-04-07 00:55:22.752617 | orchestrator |
2026-04-07 00:55:22.752787 | orchestrator |
2026-04-07 00:55:22.752807 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:55:22.752822 | orchestrator |
2026-04-07 00:55:22.752836 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 00:55:22.752849 | orchestrator | Tuesday 07 April 2026 00:53:38 +0000 (0:00:00.520) 0:00:00.520 *********
2026-04-07 00:55:22.752857 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.752865 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.752873 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.752880 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.752887 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.752895 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.752920 | orchestrator |
2026-04-07 00:55:22.752928 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 00:55:22.752936 | orchestrator | Tuesday 07 April 2026 00:53:39 +0000 (0:00:00.494) 0:00:01.015 *********
2026-04-07 00:55:22.752947 | orchestrator | ok: [testbed-node-0] => (item=enable_neutron_True)
2026-04-07 00:55:22.752958 | orchestrator | ok: [testbed-node-1] => (item=enable_neutron_True)
2026-04-07 00:55:22.753515 | orchestrator | ok: [testbed-node-2] => (item=enable_neutron_True)
2026-04-07 00:55:22.753540 | orchestrator | ok: [testbed-node-3] => (item=enable_neutron_True)
2026-04-07 00:55:22.753548 | orchestrator | ok: [testbed-node-4] => (item=enable_neutron_True)
2026-04-07 00:55:22.753555 | orchestrator | ok: [testbed-node-5] => (item=enable_neutron_True)
2026-04-07 00:55:22.753563 | orchestrator |
2026-04-07 00:55:22.753571 | orchestrator | PLAY [Apply role neutron] ******************************************************
2026-04-07 00:55:22.753589 | orchestrator |
2026-04-07 00:55:22.753632 | orchestrator | TASK [neutron : include_tasks] *************************************************
2026-04-07 00:55:22.753641 | orchestrator | Tuesday 07 April 2026 00:53:39 +0000 (0:00:00.587) 0:00:01.603 *********
2026-04-07 00:55:22.753690 | orchestrator | included: /ansible/roles/neutron/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.753701 | orchestrator |
2026-04-07 00:55:22.753708 | orchestrator | TASK [neutron : Get container facts] *******************************************
2026-04-07 00:55:22.753716 | orchestrator | Tuesday 07 April 2026 00:53:40 +0000 (0:00:00.980) 0:00:02.583 *********
2026-04-07 00:55:22.753724 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.753731 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.753738 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.753746 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.753753 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.753760 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.753767 | orchestrator |
2026-04-07 00:55:22.753775 | orchestrator | TASK [neutron : Get container volume facts] ************************************
2026-04-07 00:55:22.753782 | orchestrator | Tuesday 07 April 2026 00:53:42 +0000 (0:00:01.547) 0:00:04.130 *********
2026-04-07 00:55:22.753790 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.753804 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.753812 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.753872 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.754412 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.754431 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.754439 | orchestrator |
2026-04-07 00:55:22.754447 | orchestrator | TASK [neutron : Check for ML2/OVN presence] ************************************
2026-04-07 00:55:22.754454 | orchestrator | Tuesday 07 April 2026 00:53:43 +0000 (0:00:01.082) 0:00:05.213 *********
2026-04-07 00:55:22.754462 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.754469 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.754477 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.754484 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.754491 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.754499 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.754506 | orchestrator |
2026-04-07 00:55:22.754587 | orchestrator | TASK [neutron : Check for ML2/OVS presence] ************************************
2026-04-07 00:55:22.754605 | orchestrator | Tuesday 07 April 2026 00:53:44 +0000 (0:00:00.547) 0:00:05.760 *********
2026-04-07 00:55:22.754613 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.754620 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.754628 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.754635 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.754643 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.754650 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.754664 | orchestrator |
2026-04-07 00:55:22.754672 | orchestrator | TASK [service-ks-register : neutron | Creating/deleting services] **************
2026-04-07 00:55:22.754877 | orchestrator | Tuesday 07 April 2026 00:53:44 +0000 (0:00:00.629) 0:00:06.389 *********
2026-04-07 00:55:22.754886 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (5 retries left).
2026-04-07 00:55:22.754894 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (4 retries left).
2026-04-07 00:55:22.754918 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (3 retries left).
2026-04-07 00:55:22.754926 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (2 retries left).
2026-04-07 00:55:22.754934 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (1 retries left).
2026-04-07 00:55:22.754942 | orchestrator | failed: [testbed-node-0] (item=neutron (network)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Openstack Networking", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9696"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9696"}], "name": "neutron", "type": "network"}, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 00:55:22.754952 | orchestrator | 2026-04-07 00:55:22.754960 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:55:22.755033 | orchestrator | testbed-node-0 : ok=5  changed=0 unreachable=0 failed=1  skipped=2  rescued=0 ignored=0 2026-04-07 00:55:22.755045 | orchestrator | testbed-node-1 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:55:22.755062 | orchestrator | testbed-node-2 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:55:22.755070 | orchestrator | testbed-node-3 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:55:22.755077 | orchestrator | testbed-node-4 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:55:22.755085 | orchestrator | testbed-node-5 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:55:22.755092 | orchestrator | 2026-04-07 00:55:22.755110 | orchestrator | 2026-04-07 00:55:22.755117 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:55:22.755125 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:53.157) 0:00:59.547 ********* 2026-04-07 00:55:22.755132 | orchestrator | =============================================================================== 2026-04-07 00:55:22.755140 | orchestrator | service-ks-register : neutron | Creating/deleting services ------------- 
53.16s 2026-04-07 00:55:22.755147 | orchestrator | neutron : Get container facts ------------------------------------------- 1.55s 2026-04-07 00:55:22.755160 | orchestrator | neutron : Get container volume facts ------------------------------------ 1.08s 2026-04-07 00:55:22.755238 | orchestrator | neutron : include_tasks ------------------------------------------------- 0.98s 2026-04-07 00:55:22.755250 | orchestrator | neutron : Check for ML2/OVS presence ------------------------------------ 0.63s 2026-04-07 00:55:22.755258 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.59s 2026-04-07 00:55:22.755266 | orchestrator | neutron : Check for ML2/OVN presence ------------------------------------ 0.55s 2026-04-07 00:55:22.755274 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.49s 2026-04-07 00:55:22.755282 | orchestrator | 2026-04-07 00:55:22.755289 | orchestrator | [WARNING]: Collection community.general does not support Ansible version 2026-04-07 00:55:22.755297 | orchestrator | 2.16.14 2026-04-07 00:55:22.755305 | orchestrator | 2026-04-07 00:55:22.755313 | orchestrator | PLAY [Prepare deployment of Ceph services] ************************************* 2026-04-07 00:55:22.755320 | orchestrator | 2026-04-07 00:55:22.755328 | orchestrator | TASK [ceph-facts : Include facts.yml] ****************************************** 2026-04-07 00:55:22.755335 | orchestrator | Tuesday 07 April 2026 00:45:04 +0000 (0:00:00.663) 0:00:00.663 ********* 2026-04-07 00:55:22.755343 | orchestrator | included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.755351 | orchestrator | 2026-04-07 00:55:22.755358 | orchestrator | TASK [ceph-facts : Check if it is atomic host] ********************************* 2026-04-07 00:55:22.755366 | orchestrator | Tuesday 07 April 2026 00:45:05 +0000 
(0:00:00.970) 0:00:01.633 *********
2026-04-07 00:55:22.755373 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.755381 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.755389 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.755396 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.755404 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.755411 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.755419 | orchestrator |
2026-04-07 00:55:22.755427 | orchestrator | TASK [ceph-facts : Set_fact is_atomic] *****************************************
2026-04-07 00:55:22.755434 | orchestrator | Tuesday 07 April 2026 00:45:07 +0000 (0:00:01.947) 0:00:03.580 *********
2026-04-07 00:55:22.755442 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.755449 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.755457 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.755465 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.755472 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.755480 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.755685 | orchestrator |
2026-04-07 00:55:22.755694 | orchestrator | TASK [ceph-facts : Check if podman binary is present] **************************
2026-04-07 00:55:22.755702 | orchestrator | Tuesday 07 April 2026 00:45:07 +0000 (0:00:00.596) 0:00:04.177 *********
2026-04-07 00:55:22.755709 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.755716 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.755934 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.755992 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.756178 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.756195 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.756208 | orchestrator |
2026-04-07 00:55:22.756221 | orchestrator | TASK [ceph-facts : Set_fact container_binary] **********************************
2026-04-07 00:55:22.756234 | orchestrator | Tuesday 07 April 2026 00:45:08 +0000 (0:00:00.804) 0:00:04.981 *********
2026-04-07 00:55:22.756257 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.756269 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.756282 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.756294 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.756306 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.756318 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.756330 | orchestrator |
2026-04-07 00:55:22.757046 | orchestrator | TASK [ceph-facts : Set_fact ceph_cmd] ******************************************
2026-04-07 00:55:22.757071 | orchestrator | Tuesday 07 April 2026 00:45:09 +0000 (0:00:00.690) 0:00:05.672 *********
2026-04-07 00:55:22.757084 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.757097 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.757109 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.757121 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.757133 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.757145 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.757156 | orchestrator |
2026-04-07 00:55:22.757262 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python] *********************
2026-04-07 00:55:22.757278 | orchestrator | Tuesday 07 April 2026 00:45:10 +0000 (0:00:00.777) 0:00:06.282 *********
2026-04-07 00:55:22.757290 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.757302 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.757314 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.757326 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.757339 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.757366 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.757379 | orchestrator |
2026-04-07 00:55:22.757391 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] ***
2026-04-07 00:55:22.757404 | orchestrator | Tuesday 07 April 2026 00:45:10 +0000 (0:00:00.950) 0:00:07.059 *********
2026-04-07 00:55:22.757417 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.757429 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.757442 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.757454 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.757466 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.757478 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.757491 | orchestrator |
2026-04-07 00:55:22.757503 | orchestrator | TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ******************
2026-04-07 00:55:22.757516 | orchestrator | Tuesday 07 April 2026 00:45:11 +0000 (0:00:00.950) 0:00:08.010 *********
2026-04-07 00:55:22.757528 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.757540 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.757553 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.757566 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.757578 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.757590 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.757603 | orchestrator |
2026-04-07 00:55:22.757615 | orchestrator | TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************
2026-04-07 00:55:22.757627 | orchestrator | Tuesday 07 April 2026 00:45:12 +0000 (0:00:00.715) 0:00:08.725 *********
2026-04-07 00:55:22.757647 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-07 00:55:22.757660 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-07 00:55:22.757672 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-07 00:55:22.757685 | orchestrator |
2026-04-07 00:55:22.757697 | orchestrator | TASK [ceph-facts : Set_fact container_exec_cmd]
********************************
2026-04-07 00:55:22.757710 | orchestrator | Tuesday 07 April 2026 00:45:13 +0000 (0:00:01.052) 0:00:09.777 *********
2026-04-07 00:55:22.757722 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.757734 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.757747 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.757759 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.757771 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.757784 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.757808 | orchestrator |
2026-04-07 00:55:22.757820 | orchestrator | TASK [ceph-facts : Find a running mon container] *******************************
2026-04-07 00:55:22.757832 | orchestrator | Tuesday 07 April 2026 00:45:14 +0000 (0:00:04.556) 0:00:15.744 *********
2026-04-07 00:55:22.757845 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-07 00:55:22.757857 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-07 00:55:22.757870 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-07 00:55:22.757883 | orchestrator |
2026-04-07 00:55:22.757896 | orchestrator | TASK [ceph-facts : Check for a ceph mon socket] ********************************
2026-04-07 00:55:22.757924 | orchestrator | Tuesday 07 April 2026 00:45:19 +0000 (0:00:00.902) 0:00:16.646 *********
2026-04-07 00:55:22.757941 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-04-07 00:55:22.757963 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-04-07 00:55:22.757986 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-04-07 00:55:22.758010 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.758068 | orchestrator |
2026-04-07 00:55:22.758092 | orchestrator | TASK [ceph-facts : Check if the ceph mon socket is in-use] *********************
2026-04-07 00:55:22.758116 | orchestrator | Tuesday 07 April 2026 00:45:20 +0000 (0:00:01.937) 0:00:18.584 *********
2026-04-07 00:55:22.758142 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758167 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758192 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758213 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.758232 | orchestrator |
2026-04-07 00:55:22.758244 | orchestrator | TASK [ceph-facts : Set_fact running_mon - non_container] ***********************
2026-04-07 00:55:22.758257 | orchestrator | Tuesday 07 April 2026 00:45:22 +0000 (0:00:00.500) 0:00:19.085 *********
2026-04-07 00:55:22.758348 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758370 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758395 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758409 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.758422 | orchestrator |
2026-04-07 00:55:22.758435 | orchestrator | TASK [ceph-facts : Set_fact running_mon - container] ***************************
2026-04-07 00:55:22.758458 | orchestrator | Tuesday 07 April 2026 00:45:22 +0000 (0:00:00.500) 0:00:19.085 *********
2026-04-07 00:55:22.758478 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2026-04-07 00:45:16.588137', 'end': '2026-04-07 00:45:16.678748', 'delta': '0:00:00.090611', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758495 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q',
'--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2026-04-07 00:45:17.605241', 'end': '2026-04-07 00:45:17.708954', 'delta': '0:00:00.103713', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758508 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2026-04-07 00:45:19.158492', 'end': '2026-04-07 00:45:19.274697', 'delta': '0:00:00.116205', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.758519 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.758528 | orchestrator |
2026-04-07 00:55:22.758541 | orchestrator | TASK [ceph-facts : Set_fact _container_exec_cmd] *******************************
2026-04-07 00:55:22.758551 | orchestrator | Tuesday 07 April 2026 00:45:23 +0000 (0:00:00.481) 0:00:19.566 *********
2026-04-07 00:55:22.758567 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.758582 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.758594 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.758607 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.758619 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.758630 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.758640 | orchestrator |
2026-04-07 00:55:22.758650 | orchestrator | TASK [ceph-facts : Get current fsid if cluster is already running] *************
2026-04-07 00:55:22.758661 | orchestrator | Tuesday 07 April 2026 00:45:25 +0000 (0:00:02.647) 0:00:22.213 *********
2026-04-07 00:55:22.758672 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2026-04-07 00:55:22.758684 | orchestrator |
2026-04-07 00:55:22.758696 | orchestrator | TASK [ceph-facts : Set_fact current_fsid rc 1] *********************************
2026-04-07 00:55:22.758791 | orchestrator | Tuesday 07 April 2026 00:45:26 +0000 (0:00:00.717) 0:00:22.930 *********
2026-04-07 00:55:22.758845 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.758855 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.758862 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.758870 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.758877 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.758893 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.758915 | orchestrator |
2026-04-07 00:55:22.758923 | orchestrator | TASK [ceph-facts : Get current fsid] *******************************************
2026-04-07 00:55:22.758931 | orchestrator | Tuesday 07 April 2026 00:45:27 +0000 (0:00:00.948) 0:00:23.879 *********
2026-04-07 00:55:22.758938 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.758945 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.758952 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.758960 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.758966 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.758973 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.758980 | orchestrator |
2026-04-07 00:55:22.758987 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2026-04-07 00:55:22.758993 | orchestrator | Tuesday 07 April 2026 00:45:28 +0000 (0:00:01.243) 0:00:25.122 *********
2026-04-07 00:55:22.759000 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759007 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.759013 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.759020 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.759027 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.759033 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.759040 | orchestrator |
2026-04-07 00:55:22.759047 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] ****************************
2026-04-07 00:55:22.759053 | orchestrator | Tuesday 07 April 2026 00:45:29 +0000 (0:00:00.944) 0:00:26.067 *********
2026-04-07 00:55:22.759060 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759067 | orchestrator |
2026-04-07 00:55:22.759078 | orchestrator | TASK [ceph-facts : Generate cluster fsid] **************************************
2026-04-07 00:55:22.759085 | orchestrator | Tuesday 07 April 2026 00:45:30 +0000 (0:00:00.262) 0:00:26.329 *********
2026-04-07 00:55:22.759092 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759099 | orchestrator |
2026-04-07 00:55:22.759106 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2026-04-07 00:55:22.759113 | orchestrator | Tuesday 07 April 2026 00:45:30 +0000 (0:00:00.228) 0:00:26.558 *********
2026-04-07 00:55:22.759119 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759126 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.759133 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.759139 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.759146 | orchestrator | skipping:
[testbed-node-1]
2026-04-07 00:55:22.759153 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.759159 | orchestrator |
2026-04-07 00:55:22.759166 | orchestrator | TASK [ceph-facts : Resolve device link(s)] *************************************
2026-04-07 00:55:22.759173 | orchestrator | Tuesday 07 April 2026 00:45:30 +0000 (0:00:00.679) 0:00:27.237 *********
2026-04-07 00:55:22.759180 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759186 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.759193 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.759200 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.759206 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.759213 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.759220 | orchestrator |
2026-04-07 00:55:22.759226 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved symlinks] **************
2026-04-07 00:55:22.759233 | orchestrator | Tuesday 07 April 2026 00:45:31 +0000 (0:00:00.972) 0:00:28.210 *********
2026-04-07 00:55:22.759240 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759246 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.759253 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.759260 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.759267 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.759273 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.759280 | orchestrator |
2026-04-07 00:55:22.759287 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] ***************************
2026-04-07 00:55:22.759297 | orchestrator | Tuesday 07 April 2026 00:45:32 +0000 (0:00:00.435) 0:00:28.646 *********
2026-04-07 00:55:22.759304 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759311 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.759318 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.759324 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.759331 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.759338 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.759344 | orchestrator |
2026-04-07 00:55:22.759351 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] ****
2026-04-07 00:55:22.759358 | orchestrator | Tuesday 07 April 2026 00:45:32 +0000 (0:00:00.568) 0:00:29.215 *********
2026-04-07 00:55:22.759365 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759371 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.759378 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.759385 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.759391 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.759398 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.759406 | orchestrator |
2026-04-07 00:55:22.759415 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] ***********************
2026-04-07 00:55:22.759423 | orchestrator | Tuesday 07 April 2026 00:45:33 +0000 (0:00:00.665) 0:00:29.880 *********
2026-04-07 00:55:22.759430 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759438 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.759445 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.759453 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.759461 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.759469 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.759477 | orchestrator |
2026-04-07 00:55:22.759485 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] ***
2026-04-07 00:55:22.759493 | orchestrator | Tuesday 07 April 2026 00:45:34 +0000 (0:00:01.217) 0:00:31.098 *********
2026-04-07 00:55:22.759501 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.759509 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.759516 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.759596 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.759615 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.759639 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.759652 | orchestrator |
2026-04-07 00:55:22.759663 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************
2026-04-07 00:55:22.759675 | orchestrator | Tuesday 07 April 2026 00:45:35 +0000 (0:00:00.638) 0:00:31.736 *********
2026-04-07 00:55:22.759688 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e0113da9--ca02--59fe--bdca--d5482abf5fe2-osd--block--e0113da9--ca02--59fe--bdca--d5482abf5fe2', 'dm-uuid-LVM-qrJ0lEo0sbfKYWJnOUkfPiYNIdhxxy3DFJOxYc3XSynkbT8r9ZAsZinTdj4C3pwv'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759708 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--9eeb51fd--cca7--5129--bb0c--15bc93c67722-osd--block--9eeb51fd--cca7--5129--bb0c--15bc93c67722', 'dm-uuid-LVM-nUbhE8JyxWI4yIlTiMfwGTfCsIQCTAaXH2kS21Y9fbfPK9wfe5kU86dUi9uvkF2I'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759721 | orchestrator |
skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759748 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759761 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759772 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759783 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759795 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759882 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759926 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:55:22.759947 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part1', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part14', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part15', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part16', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-07 00:55:22.759970 |
orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--e0113da9--ca02--59fe--bdca--d5482abf5fe2-osd--block--e0113da9--ca02--59fe--bdca--d5482abf5fe2'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-1uSdjJ-FReU-Bejh-1mIK-IacW-k7Ls-RcaZks', 'scsi-0QEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76', 'scsi-SQEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760070 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--9eeb51fd--cca7--5129--bb0c--15bc93c67722-osd--block--9eeb51fd--cca7--5129--bb0c--15bc93c67722'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-2e1P9d-pfP2-eXUQ-ccQa-yPfw-fE2I-SCS1ff', 'scsi-0QEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88', 'scsi-SQEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760103 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d', 'scsi-SQEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760121 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-22-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760142 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--f75c5f18--ff10--5900--9978--917c146f798b-osd--block--f75c5f18--ff10--5900--9978--917c146f798b', 'dm-uuid-LVM-yC88MiN3PryvE0fvbIhwr0IrRujbsAZCfMAG8Wujapp4JvfYCrcYMxGdRouUeoG9'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760154 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--47815a29--012a--570b--a074--b4436c47a2f4-osd--block--47815a29--012a--570b--a074--b4436c47a2f4', 
'dm-uuid-LVM-uQuuByfRrNJe2RSEgkC5hvKAsqpHeQMR8CPgZcXk6LO0dL9kCsyBt1HiTJmi4USt'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760166 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760177 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760189 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:55:22.760273 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 
'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760318 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760330 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.760348 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760368 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760380 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 
'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760485 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part1', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part14', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part15', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part16', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 
'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760507 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--f75c5f18--ff10--5900--9978--917c146f798b-osd--block--f75c5f18--ff10--5900--9978--917c146f798b'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-u2UQbS-QvW9-d0NA-ibFW-i6XF-xBrx-eZdX0A', 'scsi-0QEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba', 'scsi-SQEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760556 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--47815a29--012a--570b--a074--b4436c47a2f4-osd--block--47815a29--012a--570b--a074--b4436c47a2f4'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-7nqGLU-eOnu-DWE9-Bjej-9NH7-c88D-6HT0bS', 'scsi-0QEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6', 'scsi-SQEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760569 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696', 'scsi-SQEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760582 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-17-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760595 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': 
['dm-name-ceph--0842dd12--8111--558f--8152--9e8987e1446c-osd--block--0842dd12--8111--558f--8152--9e8987e1446c', 'dm-uuid-LVM-78tLYoniV2zuzKbpFSVYh6asI7K2E633YlBjjslh7SRkoyZBrDaNagtVPi2vq3sj'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760608 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.760667 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e59b5a6a--4894--5883--a5b3--f677d5bde0c7-osd--block--e59b5a6a--4894--5883--a5b3--f677d5bde0c7', 'dm-uuid-LVM-ARf5D8B94Jgn5F8asnJUBcF8eZEuUPcfWT1TpcC3liLoLTzUKcmVjwveJKBatcEE'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760678 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760686 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 
'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760712 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760721 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760728 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760734 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760741 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760749 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760756 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760805 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760819 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part1', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part14', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part15', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part16', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760840 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-29-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.760848 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.760855 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760863 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760957 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 
'virtual': 1}})  2026-04-07 00:55:22.760975 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.760999 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761010 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761018 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761025 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 
'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761031 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761038 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761045 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761052 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 
'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761102 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761128 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761136 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761146 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761153 | 
orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761160 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761219 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part1', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part14', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part15', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part16', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761246 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-08-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761253 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.761264 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761272 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part1', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part14', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part15', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part16', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761322 | 
orchestrator | skipping: [testbed-node-0] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-19-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761346 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761354 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.761362 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:55:22.761372 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 
'virtual': 1}})  2026-04-07 00:55:22.761380 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part1', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part14', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part15', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part16', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 
'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761439 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--0842dd12--8111--558f--8152--9e8987e1446c-osd--block--0842dd12--8111--558f--8152--9e8987e1446c'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-krTUGj-5S7a-soWH-3nId-RGto-83dV-k7X561', 'scsi-0QEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd', 'scsi-SQEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761470 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--e59b5a6a--4894--5883--a5b3--f677d5bde0c7-osd--block--e59b5a6a--4894--5883--a5b3--f677d5bde0c7'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-vSHFoZ-edkC-yorr-qFsk-jrGs-AtMj-CzgncF', 'scsi-0QEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349', 'scsi-SQEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761482 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8', 'scsi-SQEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761490 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-26-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:55:22.761497 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.761504 | orchestrator | 2026-04-07 00:55:22.761511 | orchestrator | TASK [ceph-facts : Set_fact devices generate device list when osd_auto_discovery] *** 2026-04-07 00:55:22.761518 | orchestrator | Tuesday 07 April 2026 00:45:36 +0000 (0:00:01.189) 0:00:32.925 ********* 2026-04-07 00:55:22.761526 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e0113da9--ca02--59fe--bdca--d5482abf5fe2-osd--block--e0113da9--ca02--59fe--bdca--d5482abf5fe2', 'dm-uuid-LVM-qrJ0lEo0sbfKYWJnOUkfPiYNIdhxxy3DFJOxYc3XSynkbT8r9ZAsZinTdj4C3pwv'], 'labels': [], 'masters': [], 'uuids': []}, 
'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761534 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--9eeb51fd--cca7--5129--bb0c--15bc93c67722-osd--block--9eeb51fd--cca7--5129--bb0c--15bc93c67722', 'dm-uuid-LVM-nUbhE8JyxWI4yIlTiMfwGTfCsIQCTAaXH2kS21Y9fbfPK9wfe5kU86dUi9uvkF2I'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761591 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761602 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 
'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761613 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761639 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761647 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': 
{'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761654 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761667 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761749 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 
'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761776 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part1', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part14', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part15', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': 
['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part16', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761790 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--f75c5f18--ff10--5900--9978--917c146f798b-osd--block--f75c5f18--ff10--5900--9978--917c146f798b', 'dm-uuid-LVM-yC88MiN3PryvE0fvbIhwr0IrRujbsAZCfMAG8Wujapp4JvfYCrcYMxGdRouUeoG9'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761887 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--e0113da9--ca02--59fe--bdca--d5482abf5fe2-osd--block--e0113da9--ca02--59fe--bdca--d5482abf5fe2'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-1uSdjJ-FReU-Bejh-1mIK-IacW-k7Ls-RcaZks', 'scsi-0QEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76', 'scsi-SQEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761975 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--47815a29--012a--570b--a074--b4436c47a2f4-osd--block--47815a29--012a--570b--a074--b4436c47a2f4', 'dm-uuid-LVM-uQuuByfRrNJe2RSEgkC5hvKAsqpHeQMR8CPgZcXk6LO0dL9kCsyBt1HiTJmi4USt'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.761992 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--9eeb51fd--cca7--5129--bb0c--15bc93c67722-osd--block--9eeb51fd--cca7--5129--bb0c--15bc93c67722'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-2e1P9d-pfP2-eXUQ-ccQa-yPfw-fE2I-SCS1ff', 'scsi-0QEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88', 'scsi-SQEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762003 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d', 'scsi-SQEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762050 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 
'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762140 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-22-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762186 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762208 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 
'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762222 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762233 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762253 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': 
'0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762264 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762359 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762378 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0842dd12--8111--558f--8152--9e8987e1446c-osd--block--0842dd12--8111--558f--8152--9e8987e1446c', 'dm-uuid-LVM-78tLYoniV2zuzKbpFSVYh6asI7K2E633YlBjjslh7SRkoyZBrDaNagtVPi2vq3sj'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762391 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e59b5a6a--4894--5883--a5b3--f677d5bde0c7-osd--block--e59b5a6a--4894--5883--a5b3--f677d5bde0c7', 'dm-uuid-LVM-ARf5D8B94Jgn5F8asnJUBcF8eZEuUPcfWT1TpcC3liLoLTzUKcmVjwveJKBatcEE'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762509 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part1', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part14', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part15', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part16', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-07 00:55:22.762545 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762562 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762574 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.762587 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-07 00:55:22.762601 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--f75c5f18--ff10--5900--9978--917c146f798b-osd--block--f75c5f18--ff10--5900--9978--917c146f798b'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-u2UQbS-QvW9-d0NA-ibFW-i6XF-xBrx-eZdX0A', 'scsi-0QEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba', 'scsi-SQEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762620 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762737 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 
'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762757 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762773 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762784 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--47815a29--012a--570b--a074--b4436c47a2f4-osd--block--47815a29--012a--570b--a074--b4436c47a2f4'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-7nqGLU-eOnu-DWE9-Bjej-9NH7-c88D-6HT0bS', 'scsi-0QEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6', 'scsi-SQEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762804 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762817 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762925 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 
'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762947 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696', 'scsi-SQEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762965 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 
'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762978 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.762999 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763011 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-17-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': 
'2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763133 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part1', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part14', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part15', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part16', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': 
['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763156 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763177 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.763191 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763204 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 
'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763291 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--0842dd12--8111--558f--8152--9e8987e1446c-osd--block--0842dd12--8111--558f--8152--9e8987e1446c'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-krTUGj-5S7a-soWH-3nId-RGto-83dV-k7X561', 'scsi-0QEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd', 'scsi-SQEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763312 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 
'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763330 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763351 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--e59b5a6a--4894--5883--a5b3--f677d5bde0c7-osd--block--e59b5a6a--4894--5883--a5b3--f677d5bde0c7'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-vSHFoZ-edkC-yorr-qFsk-jrGs-AtMj-CzgncF', 'scsi-0QEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349', 'scsi-SQEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763364 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763377 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763469 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 
'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8', 'scsi-SQEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763505 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763517 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 
'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763615 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part1', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part14', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part15', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part16', 'scsi-SQEMU_QEMU_HARDDISK_d03c7bab-3b09-492f-89a4-e7206370e450-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 
512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763635 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763652 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-26-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763672 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 
'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763685 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763693 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.763701 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763776 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was 
False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-19-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763788 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763795 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.763806 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 
'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763819 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:55:22.763868 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part1', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part14', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part15', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part16', 'scsi-SQEMU_QEMU_HARDDISK_0c3908cd-9ae7-4f32-bcd9-16913c42debe-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-07 00:55:22.763887 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-08-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.763895 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.763920 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.763933 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.763940 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.763996 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part1', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part14', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part15', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part16', 'scsi-SQEMU_QEMU_HARDDISK_b9ec3702-6661-45b7-a0cf-93bf4acfc295-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.764016 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-29-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-04-07 00:55:22.764023 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.764030 | orchestrator |
2026-04-07 00:55:22.764048 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ******************************
2026-04-07 00:55:22.764056 | orchestrator | Tuesday 07 April 2026 00:45:37 +0000 (0:00:00.855) 0:00:33.781 *********
2026-04-07 00:55:22.764063 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.764070 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.764077 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.764084 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.764091 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.764098 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.764105 | orchestrator |
2026-04-07 00:55:22.764112 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] ***************
2026-04-07 00:55:22.764119 | orchestrator | Tuesday 07 April 2026 00:45:38 +0000 (0:00:01.325) 0:00:35.106 *********
2026-04-07 00:55:22.764126 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.764133 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.764139 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.764146 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.764153 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.764160 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.764167 | orchestrator |
2026-04-07 00:55:22.764173 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2026-04-07 00:55:22.764180 | orchestrator | Tuesday 07 April 2026 00:45:39 +0000 (0:00:00.783) 0:00:35.890 *********
2026-04-07 00:55:22.764187 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.764194 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.764201 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.764208 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.764215 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.764221 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.764228 | orchestrator |
2026-04-07 00:55:22.764235 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2026-04-07 00:55:22.764242 | orchestrator | Tuesday 07 April 2026 00:45:40 +0000 (0:00:00.997) 0:00:36.888 *********
2026-04-07 00:55:22.764249 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.764263 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.764270 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.764277 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.764283 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.764290 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.764297 | orchestrator |
2026-04-07 00:55:22.764304 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2026-04-07 00:55:22.764311 | orchestrator | Tuesday 07 April 2026 00:45:41 +0000 (0:00:00.650) 0:00:37.538 *********
2026-04-07 00:55:22.764317 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.764324 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.764331 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.764338 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.764345 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.764352 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.764359 | orchestrator |
2026-04-07 00:55:22.764370 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2026-04-07 00:55:22.764377 | orchestrator | Tuesday 07 April 2026 00:45:41 +0000 (0:00:00.579) 0:00:38.117 *********
2026-04-07 00:55:22.764384 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.764391 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.764397 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.764404 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.764411 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.764418 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.764425 | orchestrator |
2026-04-07 00:55:22.764474 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] *************************
2026-04-07 00:55:22.764491 | orchestrator | Tuesday 07 April 2026 00:45:42 +0000 (0:00:00.987) 0:00:39.105 *********
2026-04-07 00:55:22.764499 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0)
2026-04-07 00:55:22.764506 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0)
2026-04-07 00:55:22.764513 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0)
2026-04-07 00:55:22.764520 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-0)
2026-04-07 00:55:22.764527 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1)
2026-04-07 00:55:22.764534 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0)
2026-04-07 00:55:22.764541 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-0)
2026-04-07 00:55:22.764548 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-2)
2026-04-07 00:55:22.764555 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1)
2026-04-07 00:55:22.764561 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1)
2026-04-07 00:55:22.764568 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-1)
2026-04-07 00:55:22.764575 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2)
2026-04-07 00:55:22.764582 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-2)
2026-04-07 00:55:22.764588 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1)
2026-04-07 00:55:22.764595 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-1)
2026-04-07 00:55:22.764602 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-2)
2026-04-07 00:55:22.764609 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2)
2026-04-07 00:55:22.764615 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2)
2026-04-07 00:55:22.764622 | orchestrator |
2026-04-07 00:55:22.764629 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] *************************
2026-04-07 00:55:22.764639 | orchestrator | Tuesday 07 April 2026 00:45:45 +0000 (0:00:02.850) 0:00:41.956 *********
2026-04-07 00:55:22.764647 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-04-07 00:55:22.764653 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-04-07 00:55:22.764660 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-04-07 00:55:22.764667 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.764674 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2026-04-07 00:55:22.764681 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2026-04-07 00:55:22.764687 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2026-04-07 00:55:22.764694 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.764701 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2026-04-07 00:55:22.764708 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2026-04-07 00:55:22.764714 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2026-04-07 00:55:22.764721 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.764728 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2026-04-07 00:55:22.764735 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2026-04-07 00:55:22.764741 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2026-04-07 00:55:22.764748 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.764755 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)
2026-04-07 00:55:22.764766 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)
2026-04-07 00:55:22.764773 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)
2026-04-07 00:55:22.764780 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.764787 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)
2026-04-07 00:55:22.764794 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)
2026-04-07 00:55:22.764800 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)
2026-04-07 00:55:22.764807 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.764814 | orchestrator |
2026-04-07 00:55:22.764821 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] ***********************
2026-04-07 00:55:22.764828 | orchestrator | Tuesday 07 April 2026 00:45:47 +0000 (0:00:01.461) 0:00:43.417 *********
2026-04-07 00:55:22.764835 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.764841 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.764848 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.764855 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.764862 | orchestrator |
2026-04-07 00:55:22.764869 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] ***
2026-04-07 00:55:22.764876 | orchestrator | Tuesday 07 April 2026 00:45:48 +0000 (0:00:01.524) 0:00:44.941 *********
2026-04-07 00:55:22.764883 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.764890 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.764896 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.764914 | orchestrator |
2026-04-07 00:55:22.764921 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] ****
2026-04-07 00:55:22.764927 | orchestrator | Tuesday 07 April 2026 00:45:49 +0000 (0:00:00.767) 0:00:45.709 *********
2026-04-07 00:55:22.764934 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.764941 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.764948 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.764954 | orchestrator |
2026-04-07 00:55:22.764961 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] ****
2026-04-07 00:55:22.764968 | orchestrator | Tuesday 07 April 2026 00:45:50 +0000 (0:00:00.638) 0:00:46.348 *********
2026-04-07 00:55:22.764974 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.764981 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.764988 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.764994 | orchestrator |
2026-04-07 00:55:22.765001 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] ***************
2026-04-07 00:55:22.765027 | orchestrator | Tuesday 07 April 2026 00:45:50 +0000 (0:00:00.680) 0:00:47.028 *********
2026-04-07 00:55:22.765035 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.765042 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.765048 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.765055 | orchestrator |
2026-04-07 00:55:22.765062 | orchestrator | TASK [ceph-facts : Set_fact _interface] ****************************************
2026-04-07 00:55:22.765069 | orchestrator | Tuesday 07 April 2026 00:45:51 +0000 (0:00:00.916) 0:00:47.944 *********
2026-04-07 00:55:22.765076 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-07 00:55:22.765082 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-07 00:55:22.765089 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-07 00:55:22.765096 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.765103 | orchestrator |
2026-04-07 00:55:22.765110 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ******
2026-04-07 00:55:22.765118 | orchestrator | Tuesday 07 April 2026 00:45:52 +0000 (0:00:00.719) 0:00:48.664 *********
2026-04-07 00:55:22.765126 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-07 00:55:22.765134 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-07 00:55:22.765146 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-07 00:55:22.765154 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.765161 | orchestrator |
2026-04-07 00:55:22.765169 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ******
2026-04-07 00:55:22.765177 | orchestrator | Tuesday 07 April 2026 00:45:53 +0000 (0:00:00.689) 0:00:49.354 *********
2026-04-07 00:55:22.765185 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-07 00:55:22.765193 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-07 00:55:22.765207 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-07 00:55:22.765219 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.765234 | orchestrator |
2026-04-07 00:55:22.765252 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] ***************************
2026-04-07 00:55:22.765264 | orchestrator | Tuesday 07 April 2026 00:45:53 +0000 (0:00:00.374) 0:00:49.728 *********
2026-04-07 00:55:22.765275 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.765287 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.765299 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.765311 | orchestrator |
2026-04-07 00:55:22.765323 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] *************************************
2026-04-07 00:55:22.765335 | orchestrator | Tuesday 07 April 2026 00:45:53 +0000 (0:00:00.374) 0:00:50.103 *********
2026-04-07 00:55:22.765348 | orchestrator | ok: [testbed-node-3] => (item=0)
2026-04-07 00:55:22.765361 | orchestrator | ok: [testbed-node-5] => (item=0)
2026-04-07 00:55:22.765374 | orchestrator | ok: [testbed-node-4] => (item=0)
2026-04-07 00:55:22.765387 | orchestrator |
2026-04-07 00:55:22.765399 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] **************************************
2026-04-07 00:55:22.765412 | orchestrator | Tuesday 07 April 2026 00:45:54 +0000 (0:00:00.713) 0:00:50.817 *********
2026-04-07 00:55:22.765424 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-07 00:55:22.765436 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-07 00:55:22.765449 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-07 00:55:22.765460 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3)
2026-04-07 00:55:22.765473 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4)
2026-04-07 00:55:22.765485 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5)
2026-04-07 00:55:22.765498 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager)
2026-04-07 00:55:22.765509 | orchestrator |
2026-04-07 00:55:22.765522 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ********************************
2026-04-07 00:55:22.765534 | orchestrator | Tuesday 07 April 2026 00:45:55 +0000 (0:00:00.943) 0:00:51.760 *********
2026-04-07 00:55:22.765545 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-07 00:55:22.765557 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-07 00:55:22.765569 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-07 00:55:22.765581 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3)
2026-04-07 00:55:22.765593 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4)
2026-04-07 00:55:22.765606 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5)
2026-04-07 00:55:22.765618 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager)
2026-04-07 00:55:22.765626 | orchestrator |
2026-04-07 00:55:22.765632 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2026-04-07 00:55:22.765639 | orchestrator | Tuesday 07 April 2026 00:45:57 +0000 (0:00:01.927) 0:00:53.688 *********
2026-04-07 00:55:22.765646 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:55:22.765661 | orchestrator |
2026-04-07 00:55:22.765668 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2026-04-07 00:55:22.765675 | orchestrator | Tuesday 07 April 2026 00:45:58 +0000 (0:00:01.019) 0:00:54.707 *********
2026-04-07 00:55:22.765706 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:55:22.765714 | orchestrator |
2026-04-07 00:55:22.765721 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2026-04-07 00:55:22.765728 | orchestrator | Tuesday 07 April 2026 00:45:59 +0000 (0:00:01.079) 0:00:55.786 *********
2026-04-07 00:55:22.765735 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.765741 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.765748 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.765755 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.765762 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.765769 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.765776 | orchestrator |
2026-04-07 00:55:22.765783 | orchestrator | TASK [ceph-handler : Check for an osd container] *******************************
2026-04-07 00:55:22.765790 | orchestrator | Tuesday 07 April 2026 00:46:00 +0000 (0:00:01.038) 0:00:56.825 *********
2026-04-07 00:55:22.765796 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.765803 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.765810 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.765817 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.765824 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.765830 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.765837 | orchestrator |
2026-04-07 00:55:22.765844 | orchestrator | TASK [ceph-handler : Check for a mds container] ********************************
2026-04-07 00:55:22.765851 | orchestrator | Tuesday 07 April 2026 00:46:01 +0000 (0:00:01.065) 0:00:57.890 *********
2026-04-07 00:55:22.765858 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.765864 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.765871 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.765878 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.765885 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.765892 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.765917 | orchestrator |
2026-04-07 00:55:22.765927 | orchestrator | TASK [ceph-handler : Check for a rgw container] ********************************
2026-04-07 00:55:22.765942 | orchestrator | Tuesday 07 April 2026 00:46:02 +0000 (0:00:00.887) 0:00:58.777 *********
2026-04-07 00:55:22.765949 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.765955 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.765962 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.765969 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.765975 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.765982 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.765989 | orchestrator |
2026-04-07 00:55:22.765995 | orchestrator | TASK [ceph-handler : Check for a mgr container] ********************************
2026-04-07 00:55:22.766002 | orchestrator | Tuesday 07 April 2026 00:46:03 +0000 (0:00:01.318) 0:01:00.096 *********
2026-04-07 00:55:22.766009 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.766048 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.766057 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.766064 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.766071 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.766077 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.766084 | orchestrator |
2026-04-07 00:55:22.766091 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] *************************
2026-04-07 00:55:22.766098 | orchestrator | Tuesday 07 April 2026 00:46:04 +0000 (0:00:01.121) 0:01:01.218 *********
2026-04-07 00:55:22.766105 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.766116 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.766123 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.766130 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.766136 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.766143 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.766150 | orchestrator |
2026-04-07 00:55:22.766156 | orchestrator | TASK [ceph-handler : Check for a nfs container] ********************************
2026-04-07 00:55:22.766163 | orchestrator | Tuesday 07 April 2026 00:46:05 +0000 (0:00:00.837) 0:01:02.056 *********
2026-04-07 00:55:22.766170 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.766177 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.766183 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.766190 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.766196 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.766203 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.766210 | orchestrator |
2026-04-07 00:55:22.766217 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] *************************
2026-04-07 00:55:22.766223 | orchestrator | Tuesday 07 April 2026 00:46:06 +0000 (0:00:00.559) 0:01:02.616 *********
2026-04-07 00:55:22.766230 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.766237 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.766243 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.766250 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.766257 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.766263 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.766270 | orchestrator |
2026-04-07 00:55:22.766277 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] **********************
2026-04-07 00:55:22.766284 | orchestrator | Tuesday 07 April 2026 00:46:07 +0000 (0:00:01.553) 0:01:04.170 *********
2026-04-07 00:55:22.766291 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.766297 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.766304 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.766310 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.766317 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.766324 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.766330 | orchestrator |
2026-04-07 00:55:22.766337 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] *******************
2026-04-07 00:55:22.766344 | orchestrator | Tuesday 07 April 2026 00:46:09 +0000 (0:00:01.393) 0:01:05.564 *********
2026-04-07 00:55:22.766351 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.766357 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.766364 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.766371 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.766378 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.766384 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.766391 | orchestrator |
2026-04-07 00:55:22.766398 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ******************************
2026-04-07 00:55:22.766405 | orchestrator | Tuesday 07 April 2026 00:46:10 +0000 (0:00:00.769) 0:01:06.333 *********
2026-04-07 00:55:22.766411 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.766418 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.766425 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.766431 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.766459 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.766467 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.766473 | orchestrator |
2026-04-07 00:55:22.766480 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ******************************
2026-04-07 00:55:22.766487 | orchestrator | Tuesday 07 April 2026 00:46:10 +0000 (0:00:00.578) 0:01:06.912 *********
2026-04-07 00:55:22.766494 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.766501 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.766507 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.766514 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.766521 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.766533 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.766540 | orchestrator |
2026-04-07 00:55:22.766547 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ******************************
2026-04-07 00:55:22.766554 | orchestrator | Tuesday 07 April 2026 00:46:11 +0000 (0:00:00.730) 0:01:07.643 *********
2026-04-07 00:55:22.766563 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.766574 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.766593 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.766604 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.766615 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.766626 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.766636 | orchestrator |
2026-04-07 00:55:22.766648 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
2026-04-07 00:55:22.766659 | orchestrator | Tuesday 07 April 2026 00:46:11 +0000 (0:00:00.595) 0:01:08.239 *********
2026-04-07 00:55:22.766670 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.766682 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.766694 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.766705 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.766717 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.766725 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.766732 | orchestrator | 2026-04-07 00:55:22.766738 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-07 00:55:22.766750 | orchestrator | Tuesday 07 April 2026 00:46:12 +0000 (0:00:00.824) 0:01:09.064 ********* 2026-04-07 00:55:22.766757 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.766763 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.766770 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.766777 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.766784 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.766790 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.766797 | orchestrator | 2026-04-07 00:55:22.766804 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-07 00:55:22.766810 | orchestrator | Tuesday 07 April 2026 00:46:13 +0000 (0:00:00.653) 0:01:09.717 ********* 2026-04-07 00:55:22.766817 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.766824 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.766830 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.766837 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.766844 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.766851 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.766857 | orchestrator | 2026-04-07 00:55:22.766864 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-07 00:55:22.766871 | orchestrator | Tuesday 07 April 2026 00:46:14 +0000 (0:00:00.601) 0:01:10.318 ********* 2026-04-07 00:55:22.766877 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.766884 | orchestrator | skipping: [testbed-node-4] 
2026-04-07 00:55:22.766891 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.766974 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.766985 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.766991 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.766998 | orchestrator |
2026-04-07 00:55:22.767005 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] ****************************
2026-04-07 00:55:22.767012 | orchestrator | Tuesday 07 April 2026 00:46:14 +0000 (0:00:00.663) 0:01:10.981 *********
2026-04-07 00:55:22.767019 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.767026 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.767033 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.767039 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.767046 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.767053 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.767060 | orchestrator |
2026-04-07 00:55:22.767067 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] *************************
2026-04-07 00:55:22.767074 | orchestrator | Tuesday 07 April 2026 00:46:15 +0000 (0:00:00.897) 0:01:11.879 *********
2026-04-07 00:55:22.767089 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.767100 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.767116 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.767130 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.767141 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.767152 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.767162 | orchestrator |
2026-04-07 00:55:22.767173 | orchestrator | TASK [ceph-container-common : Generate systemd ceph target file] ***************
2026-04-07 00:55:22.767184 | orchestrator | Tuesday 07 April 2026 00:46:16 +0000 (0:00:01.381) 0:01:13.261 *********
2026-04-07 00:55:22.767193 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.767203 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.767213 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:55:22.767225 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.767236 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:55:22.767249 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:55:22.767260 | orchestrator |
2026-04-07 00:55:22.767271 | orchestrator | TASK [ceph-container-common : Enable ceph.target] ******************************
2026-04-07 00:55:22.767282 | orchestrator | Tuesday 07 April 2026 00:46:19 +0000 (0:00:02.170) 0:01:15.432 *********
2026-04-07 00:55:22.767293 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:55:22.767304 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:55:22.767314 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.767323 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.767329 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.767336 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:55:22.767342 | orchestrator |
2026-04-07 00:55:22.767351 | orchestrator | TASK [ceph-container-common : Include prerequisites.yml] ***********************
2026-04-07 00:55:22.767366 | orchestrator | Tuesday 07 April 2026 00:46:21 +0000 (0:00:02.594) 0:01:18.027 *********
2026-04-07 00:55:22.767413 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/prerequisites.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:55:22.767427 | orchestrator |
2026-04-07 00:55:22.767438 | orchestrator | TASK [ceph-container-common : Stop lvmetad] ************************************
2026-04-07 00:55:22.767448 | orchestrator | Tuesday 07 April 2026 00:46:23 +0000 (0:00:01.278) 0:01:19.306 *********
2026-04-07 00:55:22.767459 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.767470 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.767480 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.767490 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.767500 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.767510 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.767521 | orchestrator |
2026-04-07 00:55:22.767533 | orchestrator | TASK [ceph-container-common : Disable and mask lvmetad service] ****************
2026-04-07 00:55:22.767544 | orchestrator | Tuesday 07 April 2026 00:46:23 +0000 (0:00:00.881) 0:01:20.187 *********
2026-04-07 00:55:22.767551 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.767557 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.767563 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.767570 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.767576 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.767582 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.767589 | orchestrator |
2026-04-07 00:55:22.767595 | orchestrator | TASK [ceph-container-common : Remove ceph udev rules] **************************
2026-04-07 00:55:22.767602 | orchestrator | Tuesday 07 April 2026 00:46:24 +0000 (0:00:00.647) 0:01:20.835 *********
2026-04-07 00:55:22.767608 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules)
2026-04-07 00:55:22.767614 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules)
2026-04-07 00:55:22.767621 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules)
2026-04-07 00:55:22.767632 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules)
2026-04-07 00:55:22.767645 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules)
2026-04-07 00:55:22.767652 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules)
2026-04-07 00:55:22.767658 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules)
2026-04-07 00:55:22.767664 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules)
2026-04-07 00:55:22.767670 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules)
2026-04-07 00:55:22.767677 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules)
2026-04-07 00:55:22.767683 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules)
2026-04-07 00:55:22.767690 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules)
2026-04-07 00:55:22.767701 | orchestrator |
2026-04-07 00:55:22.767712 | orchestrator | TASK [ceph-container-common : Ensure tmpfiles.d is present] ********************
2026-04-07 00:55:22.767722 | orchestrator | Tuesday 07 April 2026 00:46:26 +0000 (0:00:02.188) 0:01:23.023 *********
2026-04-07 00:55:22.767732 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.767742 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.767752 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.767762 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:55:22.767771 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:55:22.767782 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:55:22.767792 | orchestrator |
2026-04-07 00:55:22.767803 | orchestrator | TASK [ceph-container-common : Restore certificates selinux context] ************
2026-04-07 00:55:22.767813 | orchestrator | Tuesday 07 April 2026 00:46:28 +0000 (0:00:01.294) 0:01:24.318 *********
2026-04-07 00:55:22.767825 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.767835 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.767845 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.767856 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.767867 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.767876 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.767887 | orchestrator |
2026-04-07 00:55:22.767915 | orchestrator | TASK [ceph-container-common : Install python3 on osd nodes] ********************
2026-04-07 00:55:22.767927 | orchestrator | Tuesday 07 April 2026 00:46:29 +0000 (0:00:01.327) 0:01:25.645 *********
2026-04-07 00:55:22.767937 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.767948 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.767959 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.767970 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.767977 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.767983 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.767989 | orchestrator |
2026-04-07 00:55:22.767995 | orchestrator | TASK [ceph-container-common : Include registry.yml] ****************************
2026-04-07 00:55:22.768002 | orchestrator | Tuesday 07 April 2026 00:46:30 +0000 (0:00:01.011) 0:01:26.657 *********
2026-04-07 00:55:22.768008 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768014 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768020 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768027 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.768033 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.768039 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.768045 | orchestrator |
2026-04-07 00:55:22.768051 | orchestrator | TASK [ceph-container-common : Include fetch_image.yml] *************************
2026-04-07 00:55:22.768058 | orchestrator | Tuesday 07 April 2026 00:46:31 +0000 (0:00:00.796) 0:01:27.453 *********
2026-04-07 00:55:22.768064 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/fetch_image.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:55:22.768071 | orchestrator |
2026-04-07 00:55:22.768083 | orchestrator | TASK [ceph-container-common : Pulling Ceph container image] ********************
2026-04-07 00:55:22.768113 | orchestrator | Tuesday 07 April 2026 00:46:32 +0000 (0:00:01.175) 0:01:28.629 *********
2026-04-07 00:55:22.768121 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.768127 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.768134 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.768140 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.768146 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.768152 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.768159 | orchestrator |
2026-04-07 00:55:22.768165 | orchestrator | TASK [ceph-container-common : Pulling alertmanager/prometheus/grafana container images] ***
2026-04-07 00:55:22.768172 | orchestrator | Tuesday 07 April 2026 00:47:15 +0000 (0:00:42.984) 0:02:11.614 *********
2026-04-07 00:55:22.768178 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-07 00:55:22.768184 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-07 00:55:22.768191 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-07 00:55:22.768197 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768203 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-07 00:55:22.768209 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-07 00:55:22.768216 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-07 00:55:22.768222 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768228 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-07 00:55:22.768234 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-07 00:55:22.768241 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-07 00:55:22.768247 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768258 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-07 00:55:22.768264 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-07 00:55:22.768271 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-07 00:55:22.768277 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.768283 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-07 00:55:22.768289 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-07 00:55:22.768296 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-07 00:55:22.768302 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.768308 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-07 00:55:22.768314 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-07 00:55:22.768321 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-07 00:55:22.768327 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.768333 | orchestrator |
2026-04-07 00:55:22.768340 | orchestrator | TASK [ceph-container-common : Pulling node-exporter container image] ***********
2026-04-07 00:55:22.768346 | orchestrator | Tuesday 07 April 2026 00:47:16 +0000 (0:00:00.975) 0:02:12.589 *********
2026-04-07 00:55:22.768352 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768358 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768365 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768371 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.768377 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.768383 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.768390 | orchestrator |
2026-04-07 00:55:22.768396 | orchestrator | TASK [ceph-container-common : Export local ceph dev image] *********************
2026-04-07 00:55:22.768402 | orchestrator | Tuesday 07 April 2026 00:47:16 +0000 (0:00:00.628) 0:02:13.217 *********
2026-04-07 00:55:22.768412 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768418 | orchestrator |
2026-04-07 00:55:22.768425 | orchestrator | TASK [ceph-container-common : Copy ceph dev image file] ************************
2026-04-07 00:55:22.768431 | orchestrator | Tuesday 07 April 2026 00:47:17 +0000 (0:00:00.133) 0:02:13.351 *********
2026-04-07 00:55:22.768437 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768444 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768450 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768456 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.768462 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.768468 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.768475 | orchestrator |
2026-04-07 00:55:22.768481 | orchestrator | TASK [ceph-container-common : Load ceph dev image] *****************************
2026-04-07 00:55:22.768487 | orchestrator | Tuesday 07 April 2026 00:47:17 +0000 (0:00:00.834) 0:02:14.185 *********
2026-04-07 00:55:22.768494 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768500 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768506 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768512 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.768518 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.768525 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.768531 | orchestrator |
2026-04-07 00:55:22.768537 | orchestrator | TASK [ceph-container-common : Remove tmp ceph dev image file] ******************
2026-04-07 00:55:22.768543 | orchestrator | Tuesday 07 April 2026 00:47:18 +0000 (0:00:00.580) 0:02:14.765 *********
2026-04-07 00:55:22.768550 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768556 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768562 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768568 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.768574 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.768581 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.768587 | orchestrator |
2026-04-07 00:55:22.768593 | orchestrator | TASK [ceph-container-common : Get ceph version] ********************************
2026-04-07 00:55:22.768599 | orchestrator | Tuesday 07 April 2026 00:47:19 +0000 (0:00:00.826) 0:02:15.592 *********
2026-04-07 00:55:22.768606 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.768612 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.768633 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.768640 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.768647 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.768653 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.768659 | orchestrator |
2026-04-07 00:55:22.768665 | orchestrator | TASK [ceph-container-common : Set_fact ceph_version ceph_version.stdout.split] ***
2026-04-07 00:55:22.768672 | orchestrator | Tuesday 07 April 2026 00:47:22 +0000 (0:00:03.607) 0:02:19.200 *********
2026-04-07 00:55:22.768678 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.768685 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.768691 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.768697 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.768704 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.768710 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.768716 | orchestrator |
2026-04-07 00:55:22.768722 | orchestrator | TASK [ceph-container-common : Include release.yml] *****************************
2026-04-07 00:55:22.768729 | orchestrator | Tuesday 07 April 2026 00:47:23 +0000 (0:00:00.642) 0:02:19.842 *********
2026-04-07 00:55:22.768736 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/release.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:55:22.768742 | orchestrator |
2026-04-07 00:55:22.768749 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release jewel] *********************
2026-04-07 00:55:22.768755 | orchestrator | Tuesday 07 April 2026 00:47:24 +0000 (0:00:01.139) 0:02:20.982 *********
2026-04-07 00:55:22.768762 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768772 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768778 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768785 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.768791 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.768797 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.768804 | orchestrator |
2026-04-07 00:55:22.768811 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release kraken] ********************
2026-04-07 00:55:22.768835 | orchestrator | Tuesday 07 April 2026 00:47:25 +0000 (0:00:00.601) 0:02:21.583 *********
2026-04-07 00:55:22.768850 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768861 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768872 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768881 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.768892 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.768918 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.768929 | orchestrator |
2026-04-07 00:55:22.768940 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release luminous] ******************
2026-04-07 00:55:22.768950 | orchestrator | Tuesday 07 April 2026 00:47:26 +0000 (0:00:00.785) 0:02:22.368 *********
2026-04-07 00:55:22.768960 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.768971 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.768982 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.768992 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.769001 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.769007 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.769014 | orchestrator |
2026-04-07 00:55:22.769020 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release mimic] *********************
2026-04-07 00:55:22.769026 | orchestrator | Tuesday 07 April 2026 00:47:26 +0000 (0:00:00.584) 0:02:22.953 *********
2026-04-07 00:55:22.769033 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.769039 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.769045 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.769051 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.769058 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.769064 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.769070 | orchestrator |
2026-04-07 00:55:22.769076 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release nautilus] ******************
2026-04-07 00:55:22.769083 | orchestrator | Tuesday 07 April 2026 00:47:27 +0000 (0:00:00.778) 0:02:23.731 *********
2026-04-07 00:55:22.769089 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.769095 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.769101 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.769108 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.769114 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.769120 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.769127 | orchestrator |
2026-04-07 00:55:22.769133 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release octopus] *******************
2026-04-07 00:55:22.769139 | orchestrator | Tuesday 07 April 2026 00:47:28 +0000 (0:00:00.646) 0:02:24.378 *********
2026-04-07 00:55:22.769146 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.769152 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.769158 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.769164 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.769171 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.769177 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.769183 | orchestrator |
2026-04-07 00:55:22.769189 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release pacific] *******************
2026-04-07 00:55:22.769196 | orchestrator | Tuesday 07 April 2026 00:47:28 +0000 (0:00:00.737) 0:02:25.115 *********
2026-04-07 00:55:22.769202 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.769208 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.769214 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.769220 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.769232 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.769239 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.769245 | orchestrator |
2026-04-07 00:55:22.769252 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release quincy] ********************
2026-04-07 00:55:22.769258 | orchestrator | Tuesday 07 April 2026 00:47:29 +0000 (0:00:00.607) 0:02:25.723 *********
2026-04-07 00:55:22.769264 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.769271 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.769277 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.769283 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.769289 | orchestrator | skipping: [testbed-node-1]
2026-04-07 00:55:22.769295 | orchestrator | skipping: [testbed-node-2]
2026-04-07 00:55:22.769302 | orchestrator |
2026-04-07 00:55:22.769308 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release reef] **********************
2026-04-07 00:55:22.769336 | orchestrator | Tuesday 07 April 2026 00:47:30 +0000 (0:00:00.939) 0:02:26.662 *********
2026-04-07 00:55:22.769343 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.769350 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.769357 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.769363 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.769369 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.769376 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.769382 | orchestrator |
2026-04-07 00:55:22.769388 | orchestrator | TASK [ceph-config : Include create_ceph_initial_dirs.yml] **********************
2026-04-07 00:55:22.769395 | orchestrator | Tuesday 07 April 2026 00:47:31 +0000 (0:00:01.176) 0:02:27.839 *********
2026-04-07 00:55:22.769401 | orchestrator | included: /ansible/roles/ceph-config/tasks/create_ceph_initial_dirs.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:55:22.769408 | orchestrator |
2026-04-07 00:55:22.769415 | orchestrator | TASK [ceph-config : Create ceph initial directories] ***************************
2026-04-07 00:55:22.769421 | orchestrator | Tuesday 07 April 2026 00:47:32 +0000 (0:00:01.255) 0:02:29.094 *********
2026-04-07 00:55:22.769427 | orchestrator | changed: [testbed-node-3] => (item=/etc/ceph)
2026-04-07 00:55:22.769433 | orchestrator | changed: [testbed-node-4] => (item=/etc/ceph)
2026-04-07 00:55:22.769440 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/)
2026-04-07 00:55:22.769446 | orchestrator | changed: [testbed-node-5] => (item=/etc/ceph)
2026-04-07 00:55:22.769452 | orchestrator | changed: [testbed-node-0] => (item=/etc/ceph)
2026-04-07 00:55:22.769459 | orchestrator | changed: [testbed-node-1] => (item=/etc/ceph)
2026-04-07 00:55:22.769465 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mon)
2026-04-07 00:55:22.769471 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/)
2026-04-07 00:55:22.769478 | orchestrator | changed: [testbed-node-2] => (item=/etc/ceph)
2026-04-07 00:55:22.769487 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/)
2026-04-07 00:55:22.769494 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/)
2026-04-07 00:55:22.769500 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/)
2026-04-07 00:55:22.769506 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mon)
2026-04-07 00:55:22.769513 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/)
2026-04-07 00:55:22.769519 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/osd)
2026-04-07 00:55:22.769525 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mon)
2026-04-07 00:55:22.769531 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mon)
2026-04-07 00:55:22.769538 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mon)
2026-04-07 00:55:22.769544 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/osd)
2026-04-07 00:55:22.769550 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds)
2026-04-07 00:55:22.769557 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mon)
2026-04-07 00:55:22.769563 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/osd)
2026-04-07 00:55:22.769573 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/osd)
2026-04-07 00:55:22.769579 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/osd)
2026-04-07 00:55:22.769586 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds)
2026-04-07 00:55:22.769592 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/osd)
2026-04-07 00:55:22.769598 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/tmp)
2026-04-07 00:55:22.769604 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds)
2026-04-07 00:55:22.769611 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mds)
2026-04-07 00:55:22.769617 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mds)
2026-04-07 00:55:22.769623 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/tmp)
2026-04-07 00:55:22.769629 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/crash)
2026-04-07 00:55:22.769637 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mds)
2026-04-07 00:55:22.769648 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/tmp)
2026-04-07 00:55:22.769662 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/tmp)
2026-04-07 00:55:22.769676 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/tmp)
2026-04-07 00:55:22.769686 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/radosgw)
2026-04-07 00:55:22.769697 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/crash)
2026-04-07 00:55:22.769707 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/tmp)
2026-04-07 00:55:22.769718 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/crash)
2026-04-07 00:55:22.769727 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/crash)
2026-04-07 00:55:22.769739 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/crash)
2026-04-07 00:55:22.769750 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-07 00:55:22.769761 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/radosgw)
2026-04-07 00:55:22.769771 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/crash)
2026-04-07 00:55:22.769781 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/radosgw)
2026-04-07 00:55:22.769787 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/radosgw)
2026-04-07 00:55:22.769794 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-07 00:55:22.769800 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/radosgw)
2026-04-07 00:55:22.769806 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-07 00:55:22.769813 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-07 00:55:22.769840 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/radosgw)
2026-04-07 00:55:22.769847 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-07 00:55:22.769853 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-07 00:55:22.769860 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-07 00:55:22.769866 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-07 00:55:22.769872 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-07 00:55:22.769879 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-07 00:55:22.769885 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-07 00:55:22.769891 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-07 00:55:22.769910 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-07 00:55:22.769917 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-07 00:55:22.769924 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-07 00:55:22.769930 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-07 00:55:22.769945 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-07 00:55:22.769952 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-07 00:55:22.769958 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-07 00:55:22.769965 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-07 00:55:22.769971 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-07 00:55:22.769981 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-07 00:55:22.769988 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-07 00:55:22.769994 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-04-07 00:55:22.770000 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-07 00:55:22.770007 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-07 00:55:22.770034 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-07 00:55:22.770042 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-07 00:55:22.770049 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-07 00:55:22.770055 | orchestrator | changed: [testbed-node-3] => (item=/var/run/ceph)
2026-04-07 00:55:22.770061 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-07 00:55:22.770068 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-04-07 00:55:22.770074 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2026-04-07 00:55:22.770080 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2026-04-07 00:55:22.770086 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd) 2026-04-07 00:55:22.770092 | orchestrator | changed: [testbed-node-3] => (item=/var/log/ceph) 2026-04-07 00:55:22.770099 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2026-04-07 00:55:22.770105 | orchestrator | changed: [testbed-node-4] => (item=/var/run/ceph) 2026-04-07 00:55:22.770111 | orchestrator | changed: [testbed-node-1] => (item=/var/run/ceph) 2026-04-07 00:55:22.770117 | orchestrator | changed: [testbed-node-5] => (item=/var/run/ceph) 2026-04-07 00:55:22.770124 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2026-04-07 00:55:22.770130 | orchestrator | changed: [testbed-node-4] => (item=/var/log/ceph) 2026-04-07 00:55:22.770136 | orchestrator | changed: [testbed-node-0] => (item=/var/run/ceph) 2026-04-07 00:55:22.770142 | orchestrator | changed: [testbed-node-1] => (item=/var/log/ceph) 2026-04-07 00:55:22.770148 | orchestrator | changed: [testbed-node-5] => (item=/var/log/ceph) 2026-04-07 00:55:22.770154 | orchestrator | changed: [testbed-node-2] => (item=/var/run/ceph) 2026-04-07 00:55:22.770161 | orchestrator | changed: [testbed-node-0] => (item=/var/log/ceph) 2026-04-07 00:55:22.770167 | orchestrator | changed: [testbed-node-2] => (item=/var/log/ceph) 2026-04-07 00:55:22.770173 | orchestrator | 2026-04-07 00:55:22.770179 | orchestrator | TASK [ceph-config : Include_tasks rgw_systemd_environment_file.yml] ************ 2026-04-07 00:55:22.770186 | orchestrator | Tuesday 07 April 2026 00:47:39 +0000 (0:00:06.964) 0:02:36.059 ********* 2026-04-07 00:55:22.770192 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770199 | 
orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770205 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770211 | orchestrator | included: /ansible/roles/ceph-config/tasks/rgw_systemd_environment_file.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.770218 | orchestrator | 2026-04-07 00:55:22.770224 | orchestrator | TASK [ceph-config : Create rados gateway instance directories] ***************** 2026-04-07 00:55:22.770230 | orchestrator | Tuesday 07 April 2026 00:47:40 +0000 (0:00:01.084) 0:02:37.143 ********* 2026-04-07 00:55:22.770241 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.770248 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.770271 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.770278 | orchestrator | 2026-04-07 00:55:22.770284 | orchestrator | TASK [ceph-config : Generate environment file] ********************************* 2026-04-07 00:55:22.770291 | orchestrator | Tuesday 07 April 2026 00:47:41 +0000 (0:00:00.821) 0:02:37.965 ********* 2026-04-07 00:55:22.770297 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.770303 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.770310 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.770316 | orchestrator | 2026-04-07 00:55:22.770322 | orchestrator | TASK [ceph-config 
: Reset num_osds] ******************************************** 2026-04-07 00:55:22.770329 | orchestrator | Tuesday 07 April 2026 00:47:42 +0000 (0:00:01.097) 0:02:39.062 ********* 2026-04-07 00:55:22.770335 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.770341 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.770348 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.770354 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770360 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770367 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770373 | orchestrator | 2026-04-07 00:55:22.770379 | orchestrator | TASK [ceph-config : Count number of osds for lvm scenario] ********************* 2026-04-07 00:55:22.770385 | orchestrator | Tuesday 07 April 2026 00:47:43 +0000 (0:00:01.018) 0:02:40.081 ********* 2026-04-07 00:55:22.770392 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.770398 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.770404 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.770413 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770420 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770426 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770432 | orchestrator | 2026-04-07 00:55:22.770439 | orchestrator | TASK [ceph-config : Look up for ceph-volume rejected devices] ****************** 2026-04-07 00:55:22.770445 | orchestrator | Tuesday 07 April 2026 00:47:44 +0000 (0:00:00.638) 0:02:40.719 ********* 2026-04-07 00:55:22.770451 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.770457 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.770463 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.770470 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770476 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770482 | orchestrator | skipping: [testbed-node-2] 
2026-04-07 00:55:22.770488 | orchestrator | 2026-04-07 00:55:22.770495 | orchestrator | TASK [ceph-config : Set_fact rejected_devices] ********************************* 2026-04-07 00:55:22.770501 | orchestrator | Tuesday 07 April 2026 00:47:45 +0000 (0:00:00.909) 0:02:41.629 ********* 2026-04-07 00:55:22.770507 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.770513 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.770519 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.770526 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770532 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770538 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770544 | orchestrator | 2026-04-07 00:55:22.770551 | orchestrator | TASK [ceph-config : Set_fact _devices] ***************************************** 2026-04-07 00:55:22.770557 | orchestrator | Tuesday 07 April 2026 00:47:45 +0000 (0:00:00.590) 0:02:42.219 ********* 2026-04-07 00:55:22.770567 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.770573 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.770579 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.770586 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770592 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770598 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770604 | orchestrator | 2026-04-07 00:55:22.770611 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm batch --report' to see how many osds are to be created] *** 2026-04-07 00:55:22.770617 | orchestrator | Tuesday 07 April 2026 00:47:46 +0000 (0:00:00.837) 0:02:43.057 ********* 2026-04-07 00:55:22.770623 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.770630 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.770636 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.770642 | orchestrator | skipping: 
[testbed-node-0] 2026-04-07 00:55:22.770648 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770655 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770661 | orchestrator | 2026-04-07 00:55:22.770668 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (legacy report)] *** 2026-04-07 00:55:22.770674 | orchestrator | Tuesday 07 April 2026 00:47:47 +0000 (0:00:00.553) 0:02:43.611 ********* 2026-04-07 00:55:22.770680 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.770686 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.770692 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.770699 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770705 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770711 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770717 | orchestrator | 2026-04-07 00:55:22.770724 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (new report)] *** 2026-04-07 00:55:22.770730 | orchestrator | Tuesday 07 April 2026 00:47:47 +0000 (0:00:00.584) 0:02:44.196 ********* 2026-04-07 00:55:22.770736 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.770743 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.770749 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.770755 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770761 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770767 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770774 | orchestrator | 2026-04-07 00:55:22.770780 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm list' to see how many osds have already been created] *** 2026-04-07 00:55:22.770786 | orchestrator | Tuesday 07 April 2026 00:47:48 +0000 (0:00:00.503) 0:02:44.700 ********* 2026-04-07 00:55:22.770793 | orchestrator | skipping: 
[testbed-node-0] 2026-04-07 00:55:22.770799 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770821 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770828 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.770834 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.770840 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.770847 | orchestrator | 2026-04-07 00:55:22.770853 | orchestrator | TASK [ceph-config : Set_fact num_osds (add existing osds)] ********************* 2026-04-07 00:55:22.770860 | orchestrator | Tuesday 07 April 2026 00:47:51 +0000 (0:00:02.980) 0:02:47.681 ********* 2026-04-07 00:55:22.770866 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.770872 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.770878 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.770885 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770891 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770930 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770937 | orchestrator | 2026-04-07 00:55:22.770944 | orchestrator | TASK [ceph-config : Set_fact _osd_memory_target] ******************************* 2026-04-07 00:55:22.770950 | orchestrator | Tuesday 07 April 2026 00:47:52 +0000 (0:00:00.926) 0:02:48.607 ********* 2026-04-07 00:55:22.770957 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.770963 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.770973 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.770980 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.770986 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.770992 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.770999 | orchestrator | 2026-04-07 00:55:22.771005 | orchestrator | TASK [ceph-config : Set osd_memory_target to cluster host config] ************** 2026-04-07 00:55:22.771011 | orchestrator | Tuesday 07 April 2026 00:47:53 +0000 
(0:00:00.749) 0:02:49.357 ********* 2026-04-07 00:55:22.771018 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771024 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.771030 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.771036 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771043 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771049 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771055 | orchestrator | 2026-04-07 00:55:22.771065 | orchestrator | TASK [ceph-config : Render rgw configs] **************************************** 2026-04-07 00:55:22.771072 | orchestrator | Tuesday 07 April 2026 00:47:54 +0000 (0:00:00.959) 0:02:50.316 ********* 2026-04-07 00:55:22.771078 | orchestrator | ok: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.771085 | orchestrator | ok: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.771091 | orchestrator | ok: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.771097 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771104 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771110 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771117 | orchestrator | 2026-04-07 00:55:22.771123 | orchestrator | TASK [ceph-config : Set config to cluster] ************************************* 2026-04-07 00:55:22.771129 | orchestrator | Tuesday 07 April 2026 00:47:54 +0000 (0:00:00.598) 0:02:50.915 ********* 2026-04-07 00:55:22.771137 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast 
endpoint=192.168.16.13:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log'}])  2026-04-07 00:55:22.771145 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.13:8081'}])  2026-04-07 00:55:22.771152 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771159 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log'}])  2026-04-07 00:55:22.771166 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.14:8081'}])  2026-04-07 00:55:22.771172 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.771179 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log'}])  2026-04-07 00:55:22.771185 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.15:8081'}]) 
 2026-04-07 00:55:22.771213 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.771221 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771227 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771233 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771240 | orchestrator | 2026-04-07 00:55:22.771246 | orchestrator | TASK [ceph-config : Set rgw configs to file] *********************************** 2026-04-07 00:55:22.771252 | orchestrator | Tuesday 07 April 2026 00:47:55 +0000 (0:00:00.869) 0:02:51.784 ********* 2026-04-07 00:55:22.771259 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771265 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.771271 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.771278 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771284 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771290 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771296 | orchestrator | 2026-04-07 00:55:22.771303 | orchestrator | TASK [ceph-config : Create ceph conf directory] ******************************** 2026-04-07 00:55:22.771309 | orchestrator | Tuesday 07 April 2026 00:47:56 +0000 (0:00:00.591) 0:02:52.375 ********* 2026-04-07 00:55:22.771315 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771321 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.771328 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.771334 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771340 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771347 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771353 | orchestrator | 2026-04-07 00:55:22.771359 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2026-04-07 00:55:22.771366 | orchestrator | Tuesday 07 April 2026 
00:47:56 +0000 (0:00:00.753) 0:02:53.129 ********* 2026-04-07 00:55:22.771372 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771378 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.771385 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.771391 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771401 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771407 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771413 | orchestrator | 2026-04-07 00:55:22.771420 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2026-04-07 00:55:22.771426 | orchestrator | Tuesday 07 April 2026 00:47:57 +0000 (0:00:00.569) 0:02:53.699 ********* 2026-04-07 00:55:22.771432 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771439 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.771445 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.771451 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771457 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771463 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771470 | orchestrator | 2026-04-07 00:55:22.771476 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2026-04-07 00:55:22.771482 | orchestrator | Tuesday 07 April 2026 00:47:58 +0000 (0:00:00.773) 0:02:54.473 ********* 2026-04-07 00:55:22.771489 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771495 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.771501 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.771507 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771514 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771520 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771526 | orchestrator | 2026-04-07 00:55:22.771533 | orchestrator | TASK 
[ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2026-04-07 00:55:22.771539 | orchestrator | Tuesday 07 April 2026 00:47:58 +0000 (0:00:00.571) 0:02:55.044 ********* 2026-04-07 00:55:22.771573 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.771579 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.771585 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.771590 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771596 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771601 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771606 | orchestrator | 2026-04-07 00:55:22.771612 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2026-04-07 00:55:22.771618 | orchestrator | Tuesday 07 April 2026 00:47:59 +0000 (0:00:01.075) 0:02:56.120 ********* 2026-04-07 00:55:22.771623 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.771629 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.771634 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.771639 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771645 | orchestrator | 2026-04-07 00:55:22.771650 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2026-04-07 00:55:22.771656 | orchestrator | Tuesday 07 April 2026 00:48:00 +0000 (0:00:00.409) 0:02:56.530 ********* 2026-04-07 00:55:22.771661 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.771667 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.771672 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.771678 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771683 | orchestrator | 2026-04-07 00:55:22.771689 | orchestrator | TASK 
[ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2026-04-07 00:55:22.771694 | orchestrator | Tuesday 07 April 2026 00:48:00 +0000 (0:00:00.391) 0:02:56.921 ********* 2026-04-07 00:55:22.771700 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.771705 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.771711 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.771716 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.771722 | orchestrator | 2026-04-07 00:55:22.771727 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2026-04-07 00:55:22.771733 | orchestrator | Tuesday 07 April 2026 00:48:01 +0000 (0:00:00.376) 0:02:57.298 ********* 2026-04-07 00:55:22.771738 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.771744 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.771749 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.771755 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771760 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771766 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771771 | orchestrator | 2026-04-07 00:55:22.771790 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2026-04-07 00:55:22.771797 | orchestrator | Tuesday 07 April 2026 00:48:02 +0000 (0:00:00.995) 0:02:58.293 ********* 2026-04-07 00:55:22.771802 | orchestrator | ok: [testbed-node-3] => (item=0) 2026-04-07 00:55:22.771808 | orchestrator | ok: [testbed-node-4] => (item=0) 2026-04-07 00:55:22.771814 | orchestrator | ok: [testbed-node-5] => (item=0) 2026-04-07 00:55:22.771819 | orchestrator | skipping: [testbed-node-0] => (item=0)  2026-04-07 00:55:22.771825 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.771830 | orchestrator | skipping: [testbed-node-1] => 
(item=0)  2026-04-07 00:55:22.771836 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.771841 | orchestrator | skipping: [testbed-node-2] => (item=0)  2026-04-07 00:55:22.771847 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.771852 | orchestrator | 2026-04-07 00:55:22.771858 | orchestrator | TASK [ceph-config : Generate Ceph file] **************************************** 2026-04-07 00:55:22.771863 | orchestrator | Tuesday 07 April 2026 00:48:03 +0000 (0:00:01.642) 0:02:59.935 ********* 2026-04-07 00:55:22.771869 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.771874 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.771883 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.771889 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.771894 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.771910 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.771916 | orchestrator | 2026-04-07 00:55:22.771922 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-04-07 00:55:22.771928 | orchestrator | Tuesday 07 April 2026 00:48:05 +0000 (0:00:02.313) 0:03:02.249 ********* 2026-04-07 00:55:22.771933 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.771939 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.771944 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.771950 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.771955 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.771963 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.771969 | orchestrator | 2026-04-07 00:55:22.771974 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] ********************************** 2026-04-07 00:55:22.771980 | orchestrator | Tuesday 07 April 2026 00:48:06 +0000 (0:00:00.909) 0:03:03.159 ********* 2026-04-07 00:55:22.771986 | orchestrator | skipping: 
[testbed-node-3] 2026-04-07 00:55:22.771991 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.771997 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.772002 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.772008 | orchestrator | 2026-04-07 00:55:22.772013 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ******** 2026-04-07 00:55:22.772019 | orchestrator | Tuesday 07 April 2026 00:48:07 +0000 (0:00:00.679) 0:03:03.838 ********* 2026-04-07 00:55:22.772024 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.772030 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.772035 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.772041 | orchestrator | 2026-04-07 00:55:22.772046 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mon restart script] *********************** 2026-04-07 00:55:22.772052 | orchestrator | Tuesday 07 April 2026 00:48:07 +0000 (0:00:00.212) 0:03:04.051 ********* 2026-04-07 00:55:22.772057 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.772063 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.772070 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.772079 | orchestrator | 2026-04-07 00:55:22.772088 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ******************** 2026-04-07 00:55:22.772101 | orchestrator | Tuesday 07 April 2026 00:48:08 +0000 (0:00:01.103) 0:03:05.155 ********* 2026-04-07 00:55:22.772111 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-07 00:55:22.772119 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-07 00:55:22.772127 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-07 00:55:22.772136 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.772144 | orchestrator | 2026-04-07 
00:55:22.772152 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] ********* 2026-04-07 00:55:22.772161 | orchestrator | Tuesday 07 April 2026 00:48:09 +0000 (0:00:00.570) 0:03:05.725 ********* 2026-04-07 00:55:22.772170 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.772179 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.772189 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.772198 | orchestrator | 2026-04-07 00:55:22.772206 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] ********************************** 2026-04-07 00:55:22.772214 | orchestrator | Tuesday 07 April 2026 00:48:09 +0000 (0:00:00.344) 0:03:06.069 ********* 2026-04-07 00:55:22.772223 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.772232 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.772242 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.772248 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.772253 | orchestrator | 2026-04-07 00:55:22.772259 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] ********************** 2026-04-07 00:55:22.772270 | orchestrator | Tuesday 07 April 2026 00:48:10 +0000 (0:00:01.075) 0:03:07.145 ********* 2026-04-07 00:55:22.772275 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.772281 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.772286 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.772292 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772297 | orchestrator | 2026-04-07 00:55:22.772303 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ******** 2026-04-07 00:55:22.772308 | orchestrator | Tuesday 07 April 2026 00:48:11 +0000 (0:00:00.405) 
0:03:07.551 ********* 2026-04-07 00:55:22.772314 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772319 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.772325 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.772330 | orchestrator | 2026-04-07 00:55:22.772336 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] ******************************* 2026-04-07 00:55:22.772362 | orchestrator | Tuesday 07 April 2026 00:48:11 +0000 (0:00:00.398) 0:03:07.950 ********* 2026-04-07 00:55:22.772368 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772374 | orchestrator | 2026-04-07 00:55:22.772379 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] *********************** 2026-04-07 00:55:22.772385 | orchestrator | Tuesday 07 April 2026 00:48:11 +0000 (0:00:00.203) 0:03:08.153 ********* 2026-04-07 00:55:22.772390 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772396 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.772402 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.772407 | orchestrator | 2026-04-07 00:55:22.772413 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] ********************************* 2026-04-07 00:55:22.772418 | orchestrator | Tuesday 07 April 2026 00:48:12 +0000 (0:00:00.291) 0:03:08.445 ********* 2026-04-07 00:55:22.772424 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772429 | orchestrator | 2026-04-07 00:55:22.772435 | orchestrator | RUNNING HANDLER [ceph-handler : Get balancer module status] ******************** 2026-04-07 00:55:22.772440 | orchestrator | Tuesday 07 April 2026 00:48:12 +0000 (0:00:00.208) 0:03:08.653 ********* 2026-04-07 00:55:22.772446 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772452 | orchestrator | 2026-04-07 00:55:22.772458 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] ************** 2026-04-07 
00:55:22.772463 | orchestrator | Tuesday 07 April 2026 00:48:12 +0000 (0:00:00.190) 0:03:08.844 ********* 2026-04-07 00:55:22.772469 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772474 | orchestrator | 2026-04-07 00:55:22.772480 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ****************************** 2026-04-07 00:55:22.772485 | orchestrator | Tuesday 07 April 2026 00:48:12 +0000 (0:00:00.338) 0:03:09.183 ********* 2026-04-07 00:55:22.772491 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772496 | orchestrator | 2026-04-07 00:55:22.772502 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] ***************** 2026-04-07 00:55:22.772511 | orchestrator | Tuesday 07 April 2026 00:48:13 +0000 (0:00:00.214) 0:03:09.398 ********* 2026-04-07 00:55:22.772517 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772522 | orchestrator | 2026-04-07 00:55:22.772528 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] ******************* 2026-04-07 00:55:22.772534 | orchestrator | Tuesday 07 April 2026 00:48:13 +0000 (0:00:00.256) 0:03:09.654 ********* 2026-04-07 00:55:22.772539 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.772545 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.772550 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.772556 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772561 | orchestrator | 2026-04-07 00:55:22.772567 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] ********* 2026-04-07 00:55:22.772572 | orchestrator | Tuesday 07 April 2026 00:48:13 +0000 (0:00:00.410) 0:03:10.065 ********* 2026-04-07 00:55:22.772581 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772587 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.772592 | 
orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.772598 | orchestrator | 2026-04-07 00:55:22.772603 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] *************** 2026-04-07 00:55:22.772609 | orchestrator | Tuesday 07 April 2026 00:48:14 +0000 (0:00:00.321) 0:03:10.386 ********* 2026-04-07 00:55:22.772614 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772620 | orchestrator | 2026-04-07 00:55:22.772625 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] **************************** 2026-04-07 00:55:22.772631 | orchestrator | Tuesday 07 April 2026 00:48:14 +0000 (0:00:00.207) 0:03:10.593 ********* 2026-04-07 00:55:22.772636 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772642 | orchestrator | 2026-04-07 00:55:22.772647 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2026-04-07 00:55:22.772653 | orchestrator | Tuesday 07 April 2026 00:48:14 +0000 (0:00:00.210) 0:03:10.804 ********* 2026-04-07 00:55:22.772658 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.772664 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.772669 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.772675 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.772680 | orchestrator | 2026-04-07 00:55:22.772686 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called before restart] ******** 2026-04-07 00:55:22.772691 | orchestrator | Tuesday 07 April 2026 00:48:15 +0000 (0:00:01.192) 0:03:11.997 ********* 2026-04-07 00:55:22.772697 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.772702 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.772708 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.772713 | orchestrator | 2026-04-07 00:55:22.772719 | orchestrator | RUNNING HANDLER 
[ceph-handler : Copy mds restart script] *********************** 2026-04-07 00:55:22.772725 | orchestrator | Tuesday 07 April 2026 00:48:16 +0000 (0:00:00.350) 0:03:12.348 ********* 2026-04-07 00:55:22.772730 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.772736 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.772741 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.772747 | orchestrator | 2026-04-07 00:55:22.772752 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2026-04-07 00:55:22.772758 | orchestrator | Tuesday 07 April 2026 00:48:17 +0000 (0:00:01.132) 0:03:13.480 ********* 2026-04-07 00:55:22.772763 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.772769 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.772774 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.772780 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.772785 | orchestrator | 2026-04-07 00:55:22.772791 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2026-04-07 00:55:22.772796 | orchestrator | Tuesday 07 April 2026 00:48:18 +0000 (0:00:01.035) 0:03:14.515 ********* 2026-04-07 00:55:22.772802 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.772808 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.772813 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.772818 | orchestrator | 2026-04-07 00:55:22.772824 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2026-04-07 00:55:22.772844 | orchestrator | Tuesday 07 April 2026 00:48:18 +0000 (0:00:00.327) 0:03:14.843 ********* 2026-04-07 00:55:22.772850 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.772856 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.772861 | orchestrator | 
skipping: [testbed-node-2] 2026-04-07 00:55:22.772867 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.772872 | orchestrator | 2026-04-07 00:55:22.772879 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2026-04-07 00:55:22.772931 | orchestrator | Tuesday 07 April 2026 00:48:19 +0000 (0:00:00.983) 0:03:15.827 ********* 2026-04-07 00:55:22.772945 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.772953 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.772961 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.772971 | orchestrator | 2026-04-07 00:55:22.772980 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2026-04-07 00:55:22.772989 | orchestrator | Tuesday 07 April 2026 00:48:19 +0000 (0:00:00.327) 0:03:16.154 ********* 2026-04-07 00:55:22.772999 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.773008 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.773017 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.773027 | orchestrator | 2026-04-07 00:55:22.773036 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ******************** 2026-04-07 00:55:22.773045 | orchestrator | Tuesday 07 April 2026 00:48:21 +0000 (0:00:01.170) 0:03:17.325 ********* 2026-04-07 00:55:22.773053 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.773061 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.773071 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.773080 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.773090 | orchestrator | 2026-04-07 00:55:22.773103 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2026-04-07 
00:55:22.773113 | orchestrator | Tuesday 07 April 2026 00:48:21 +0000 (0:00:00.849) 0:03:18.175 ********* 2026-04-07 00:55:22.773119 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.773125 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.773130 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.773135 | orchestrator | 2026-04-07 00:55:22.773141 | orchestrator | RUNNING HANDLER [ceph-handler : Rbdmirrors handler] **************************** 2026-04-07 00:55:22.773146 | orchestrator | Tuesday 07 April 2026 00:48:22 +0000 (0:00:00.323) 0:03:18.498 ********* 2026-04-07 00:55:22.773152 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.773157 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.773163 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.773168 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773174 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773179 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773184 | orchestrator | 2026-04-07 00:55:22.773190 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2026-04-07 00:55:22.773195 | orchestrator | Tuesday 07 April 2026 00:48:23 +0000 (0:00:00.777) 0:03:19.276 ********* 2026-04-07 00:55:22.773201 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.773206 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.773211 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.773217 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.773222 | orchestrator | 2026-04-07 00:55:22.773228 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2026-04-07 00:55:22.773233 | orchestrator | Tuesday 07 April 2026 00:48:23 +0000 (0:00:00.956) 0:03:20.233 ********* 2026-04-07 00:55:22.773239 | orchestrator | 
ok: [testbed-node-0] 2026-04-07 00:55:22.773244 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.773250 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.773255 | orchestrator | 2026-04-07 00:55:22.773260 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2026-04-07 00:55:22.773266 | orchestrator | Tuesday 07 April 2026 00:48:24 +0000 (0:00:00.315) 0:03:20.548 ********* 2026-04-07 00:55:22.773271 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.773277 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.773282 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.773288 | orchestrator | 2026-04-07 00:55:22.773293 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2026-04-07 00:55:22.773304 | orchestrator | Tuesday 07 April 2026 00:48:25 +0000 (0:00:01.034) 0:03:21.583 ********* 2026-04-07 00:55:22.773310 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-07 00:55:22.773315 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-07 00:55:22.773320 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-07 00:55:22.773326 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773332 | orchestrator | 2026-04-07 00:55:22.773337 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2026-04-07 00:55:22.773343 | orchestrator | Tuesday 07 April 2026 00:48:25 +0000 (0:00:00.631) 0:03:22.215 ********* 2026-04-07 00:55:22.773348 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.773354 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.773359 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.773365 | orchestrator | 2026-04-07 00:55:22.773370 | orchestrator | PLAY [Apply role ceph-mon] ***************************************************** 2026-04-07 00:55:22.773375 | orchestrator | 2026-04-07 
00:55:22.773381 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-07 00:55:22.773386 | orchestrator | Tuesday 07 April 2026 00:48:26 +0000 (0:00:00.852) 0:03:23.067 ********* 2026-04-07 00:55:22.773392 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.773398 | orchestrator | 2026-04-07 00:55:22.773403 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-04-07 00:55:22.773408 | orchestrator | Tuesday 07 April 2026 00:48:27 +0000 (0:00:00.489) 0:03:23.557 ********* 2026-04-07 00:55:22.773434 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.773441 | orchestrator | 2026-04-07 00:55:22.773447 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-07 00:55:22.773452 | orchestrator | Tuesday 07 April 2026 00:48:27 +0000 (0:00:00.471) 0:03:24.028 ********* 2026-04-07 00:55:22.773458 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.773463 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.773469 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.773474 | orchestrator | 2026-04-07 00:55:22.773480 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-04-07 00:55:22.773485 | orchestrator | Tuesday 07 April 2026 00:48:28 +0000 (0:00:00.926) 0:03:24.955 ********* 2026-04-07 00:55:22.773490 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773495 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773499 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773504 | orchestrator | 2026-04-07 00:55:22.773509 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 
2026-04-07 00:55:22.773514 | orchestrator | Tuesday 07 April 2026 00:48:28 +0000 (0:00:00.289) 0:03:25.245 ********* 2026-04-07 00:55:22.773519 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773524 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773529 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773533 | orchestrator | 2026-04-07 00:55:22.773538 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-07 00:55:22.773543 | orchestrator | Tuesday 07 April 2026 00:48:29 +0000 (0:00:00.286) 0:03:25.531 ********* 2026-04-07 00:55:22.773548 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773553 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773558 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773563 | orchestrator | 2026-04-07 00:55:22.773568 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-07 00:55:22.773575 | orchestrator | Tuesday 07 April 2026 00:48:29 +0000 (0:00:00.275) 0:03:25.807 ********* 2026-04-07 00:55:22.773580 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.773585 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.773590 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.773598 | orchestrator | 2026-04-07 00:55:22.773603 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-04-07 00:55:22.773608 | orchestrator | Tuesday 07 April 2026 00:48:30 +0000 (0:00:00.893) 0:03:26.700 ********* 2026-04-07 00:55:22.773613 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773618 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773623 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773628 | orchestrator | 2026-04-07 00:55:22.773632 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-07 
00:55:22.773637 | orchestrator | Tuesday 07 April 2026 00:48:30 +0000 (0:00:00.297) 0:03:26.998 ********* 2026-04-07 00:55:22.773642 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773647 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773652 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773657 | orchestrator | 2026-04-07 00:55:22.773662 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-07 00:55:22.773667 | orchestrator | Tuesday 07 April 2026 00:48:31 +0000 (0:00:00.275) 0:03:27.273 ********* 2026-04-07 00:55:22.773672 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.773677 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.773682 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.773687 | orchestrator | 2026-04-07 00:55:22.773691 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-07 00:55:22.773696 | orchestrator | Tuesday 07 April 2026 00:48:31 +0000 (0:00:00.631) 0:03:27.905 ********* 2026-04-07 00:55:22.773701 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.773706 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.773711 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.773716 | orchestrator | 2026-04-07 00:55:22.773721 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-07 00:55:22.773726 | orchestrator | Tuesday 07 April 2026 00:48:32 +0000 (0:00:01.084) 0:03:28.990 ********* 2026-04-07 00:55:22.773730 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773735 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773740 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773745 | orchestrator | 2026-04-07 00:55:22.773750 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-07 00:55:22.773758 | orchestrator | 
Tuesday 07 April 2026 00:48:33 +0000 (0:00:00.293) 0:03:29.283 ********* 2026-04-07 00:55:22.773770 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.773780 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.773787 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.773795 | orchestrator | 2026-04-07 00:55:22.773803 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-07 00:55:22.773810 | orchestrator | Tuesday 07 April 2026 00:48:33 +0000 (0:00:00.316) 0:03:29.600 ********* 2026-04-07 00:55:22.773818 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773826 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773834 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773843 | orchestrator | 2026-04-07 00:55:22.773852 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-07 00:55:22.773860 | orchestrator | Tuesday 07 April 2026 00:48:33 +0000 (0:00:00.310) 0:03:29.911 ********* 2026-04-07 00:55:22.773867 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773875 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773881 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773886 | orchestrator | 2026-04-07 00:55:22.773891 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-07 00:55:22.773895 | orchestrator | Tuesday 07 April 2026 00:48:33 +0000 (0:00:00.278) 0:03:30.189 ********* 2026-04-07 00:55:22.773916 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773921 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773926 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773931 | orchestrator | 2026-04-07 00:55:22.773936 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-07 00:55:22.773946 | orchestrator | Tuesday 07 April 2026 
00:48:34 +0000 (0:00:00.541) 0:03:30.730 ********* 2026-04-07 00:55:22.773951 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.773956 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.773978 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.773984 | orchestrator | 2026-04-07 00:55:22.773989 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-07 00:55:22.773994 | orchestrator | Tuesday 07 April 2026 00:48:34 +0000 (0:00:00.310) 0:03:31.040 ********* 2026-04-07 00:55:22.773998 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.774003 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.774010 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.774045 | orchestrator | 2026-04-07 00:55:22.774054 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-07 00:55:22.774062 | orchestrator | Tuesday 07 April 2026 00:48:35 +0000 (0:00:00.308) 0:03:31.349 ********* 2026-04-07 00:55:22.774070 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774078 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774085 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.774093 | orchestrator | 2026-04-07 00:55:22.774100 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-07 00:55:22.774108 | orchestrator | Tuesday 07 April 2026 00:48:35 +0000 (0:00:00.318) 0:03:31.667 ********* 2026-04-07 00:55:22.774116 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774125 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774133 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.774141 | orchestrator | 2026-04-07 00:55:22.774149 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-07 00:55:22.774158 | orchestrator | Tuesday 07 April 2026 00:48:36 +0000 (0:00:00.615) 
0:03:32.283 ********* 2026-04-07 00:55:22.774166 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774174 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774182 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.774190 | orchestrator | 2026-04-07 00:55:22.774198 | orchestrator | TASK [ceph-mon : Set_fact container_exec_cmd] ********************************** 2026-04-07 00:55:22.774206 | orchestrator | Tuesday 07 April 2026 00:48:36 +0000 (0:00:00.543) 0:03:32.826 ********* 2026-04-07 00:55:22.774215 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774229 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774237 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.774242 | orchestrator | 2026-04-07 00:55:22.774247 | orchestrator | TASK [ceph-mon : Include deploy_monitors.yml] ********************************** 2026-04-07 00:55:22.774252 | orchestrator | Tuesday 07 April 2026 00:48:36 +0000 (0:00:00.317) 0:03:33.144 ********* 2026-04-07 00:55:22.774257 | orchestrator | included: /ansible/roles/ceph-mon/tasks/deploy_monitors.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.774262 | orchestrator | 2026-04-07 00:55:22.774267 | orchestrator | TASK [ceph-mon : Check if monitor initial keyring already exists] ************** 2026-04-07 00:55:22.774272 | orchestrator | Tuesday 07 April 2026 00:48:37 +0000 (0:00:00.756) 0:03:33.900 ********* 2026-04-07 00:55:22.774277 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.774282 | orchestrator | 2026-04-07 00:55:22.774287 | orchestrator | TASK [ceph-mon : Generate monitor initial keyring] ***************************** 2026-04-07 00:55:22.774292 | orchestrator | Tuesday 07 April 2026 00:48:37 +0000 (0:00:00.143) 0:03:34.043 ********* 2026-04-07 00:55:22.774297 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-07 00:55:22.774302 | orchestrator | 2026-04-07 00:55:22.774307 | orchestrator | TASK [ceph-mon : Set_fact 
_initial_mon_key_success] **************************** 2026-04-07 00:55:22.774312 | orchestrator | Tuesday 07 April 2026 00:48:38 +0000 (0:00:01.042) 0:03:35.086 ********* 2026-04-07 00:55:22.774316 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774321 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774326 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.774331 | orchestrator | 2026-04-07 00:55:22.774336 | orchestrator | TASK [ceph-mon : Get initial keyring when it already exists] ******************* 2026-04-07 00:55:22.774346 | orchestrator | Tuesday 07 April 2026 00:48:39 +0000 (0:00:00.325) 0:03:35.411 ********* 2026-04-07 00:55:22.774351 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774355 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774360 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.774365 | orchestrator | 2026-04-07 00:55:22.774370 | orchestrator | TASK [ceph-mon : Create monitor initial keyring] ******************************* 2026-04-07 00:55:22.774375 | orchestrator | Tuesday 07 April 2026 00:48:39 +0000 (0:00:00.322) 0:03:35.734 ********* 2026-04-07 00:55:22.774380 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774385 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.774390 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.774394 | orchestrator | 2026-04-07 00:55:22.774399 | orchestrator | TASK [ceph-mon : Copy the initial key in /etc/ceph (for containers)] *********** 2026-04-07 00:55:22.774404 | orchestrator | Tuesday 07 April 2026 00:48:40 +0000 (0:00:01.301) 0:03:37.035 ********* 2026-04-07 00:55:22.774409 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774414 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.774419 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.774424 | orchestrator | 2026-04-07 00:55:22.774429 | orchestrator | TASK [ceph-mon : Create monitor directory] 
************************************* 2026-04-07 00:55:22.774433 | orchestrator | Tuesday 07 April 2026 00:48:41 +0000 (0:00:00.741) 0:03:37.777 ********* 2026-04-07 00:55:22.774438 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774443 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.774448 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.774453 | orchestrator | 2026-04-07 00:55:22.774458 | orchestrator | TASK [ceph-mon : Recursively fix ownership of monitor directory] *************** 2026-04-07 00:55:22.774463 | orchestrator | Tuesday 07 April 2026 00:48:42 +0000 (0:00:00.694) 0:03:38.472 ********* 2026-04-07 00:55:22.774468 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774473 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774478 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.774482 | orchestrator | 2026-04-07 00:55:22.774487 | orchestrator | TASK [ceph-mon : Create admin keyring] ***************************************** 2026-04-07 00:55:22.774492 | orchestrator | Tuesday 07 April 2026 00:48:42 +0000 (0:00:00.751) 0:03:39.223 ********* 2026-04-07 00:55:22.774497 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774502 | orchestrator | 2026-04-07 00:55:22.774507 | orchestrator | TASK [ceph-mon : Slurp admin keyring] ****************************************** 2026-04-07 00:55:22.774512 | orchestrator | Tuesday 07 April 2026 00:48:44 +0000 (0:00:01.254) 0:03:40.478 ********* 2026-04-07 00:55:22.774517 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774521 | orchestrator | 2026-04-07 00:55:22.774547 | orchestrator | TASK [ceph-mon : Copy admin keyring over to mons] ****************************** 2026-04-07 00:55:22.774552 | orchestrator | Tuesday 07 April 2026 00:48:45 +0000 (0:00:00.953) 0:03:41.431 ********* 2026-04-07 00:55:22.774557 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-07 00:55:22.774562 | orchestrator | ok: [testbed-node-2 -> 
testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:55:22.774567 | orchestrator | ok: [testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:55:22.774572 | orchestrator | changed: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-04-07 00:55:22.774577 | orchestrator | ok: [testbed-node-2 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-04-07 00:55:22.774582 | orchestrator | ok: [testbed-node-1] => (item=None) 2026-04-07 00:55:22.774586 | orchestrator | changed: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-04-07 00:55:22.774591 | orchestrator | changed: [testbed-node-0 -> {{ item }}] 2026-04-07 00:55:22.774596 | orchestrator | ok: [testbed-node-2] => (item=None) 2026-04-07 00:55:22.774601 | orchestrator | ok: [testbed-node-2 -> {{ item }}] 2026-04-07 00:55:22.774606 | orchestrator | ok: [testbed-node-1 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-04-07 00:55:22.774611 | orchestrator | ok: [testbed-node-1 -> {{ item }}] 2026-04-07 00:55:22.774620 | orchestrator | 2026-04-07 00:55:22.774625 | orchestrator | TASK [ceph-mon : Import admin keyring into mon keyring] ************************ 2026-04-07 00:55:22.774630 | orchestrator | Tuesday 07 April 2026 00:48:48 +0000 (0:00:03.743) 0:03:45.175 ********* 2026-04-07 00:55:22.774635 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774640 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.774645 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.774650 | orchestrator | 2026-04-07 00:55:22.774655 | orchestrator | TASK [ceph-mon : Set_fact ceph-mon container command] ************************** 2026-04-07 00:55:22.774662 | orchestrator | Tuesday 07 April 2026 00:48:49 +0000 (0:00:01.067) 0:03:46.242 ********* 2026-04-07 00:55:22.774667 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774672 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774677 | orchestrator | ok: [testbed-node-2] 
2026-04-07 00:55:22.774682 | orchestrator | 2026-04-07 00:55:22.774687 | orchestrator | TASK [ceph-mon : Set_fact monmaptool container command] ************************ 2026-04-07 00:55:22.774692 | orchestrator | Tuesday 07 April 2026 00:48:50 +0000 (0:00:00.333) 0:03:46.576 ********* 2026-04-07 00:55:22.774697 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.774701 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.774706 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.774711 | orchestrator | 2026-04-07 00:55:22.774716 | orchestrator | TASK [ceph-mon : Generate initial monmap] ************************************** 2026-04-07 00:55:22.774721 | orchestrator | Tuesday 07 April 2026 00:48:50 +0000 (0:00:00.290) 0:03:46.866 ********* 2026-04-07 00:55:22.774726 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.774731 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774736 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.774741 | orchestrator | 2026-04-07 00:55:22.774746 | orchestrator | TASK [ceph-mon : Ceph monitor mkfs with keyring] ******************************* 2026-04-07 00:55:22.774751 | orchestrator | Tuesday 07 April 2026 00:48:52 +0000 (0:00:02.373) 0:03:49.239 ********* 2026-04-07 00:55:22.774756 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.774761 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.774766 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774771 | orchestrator | 2026-04-07 00:55:22.774776 | orchestrator | TASK [ceph-mon : Ceph monitor mkfs without keyring] **************************** 2026-04-07 00:55:22.774780 | orchestrator | Tuesday 07 April 2026 00:48:54 +0000 (0:00:01.453) 0:03:50.692 ********* 2026-04-07 00:55:22.774785 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.774790 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.774795 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.774800 
| orchestrator | 2026-04-07 00:55:22.774805 | orchestrator | TASK [ceph-mon : Include start_monitor.yml] ************************************ 2026-04-07 00:55:22.774810 | orchestrator | Tuesday 07 April 2026 00:48:54 +0000 (0:00:00.324) 0:03:51.016 ********* 2026-04-07 00:55:22.774815 | orchestrator | included: /ansible/roles/ceph-mon/tasks/start_monitor.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.774820 | orchestrator | 2026-04-07 00:55:22.774825 | orchestrator | TASK [ceph-mon : Ensure systemd service override directory exists] ************* 2026-04-07 00:55:22.774830 | orchestrator | Tuesday 07 April 2026 00:48:55 +0000 (0:00:00.714) 0:03:51.731 ********* 2026-04-07 00:55:22.774834 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.774839 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.774844 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.774849 | orchestrator | 2026-04-07 00:55:22.774854 | orchestrator | TASK [ceph-mon : Add ceph-mon systemd service overrides] *********************** 2026-04-07 00:55:22.774859 | orchestrator | Tuesday 07 April 2026 00:48:55 +0000 (0:00:00.307) 0:03:52.038 ********* 2026-04-07 00:55:22.774864 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.774869 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.774874 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.774879 | orchestrator | 2026-04-07 00:55:22.774884 | orchestrator | TASK [ceph-mon : Include_tasks systemd.yml] ************************************ 2026-04-07 00:55:22.774893 | orchestrator | Tuesday 07 April 2026 00:48:56 +0000 (0:00:00.255) 0:03:52.293 ********* 2026-04-07 00:55:22.774928 | orchestrator | included: /ansible/roles/ceph-mon/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.774934 | orchestrator | 2026-04-07 00:55:22.774939 | orchestrator | TASK [ceph-mon : Generate systemd unit file for mon container] 
***************** 2026-04-07 00:55:22.774944 | orchestrator | Tuesday 07 April 2026 00:48:56 +0000 (0:00:00.438) 0:03:52.732 ********* 2026-04-07 00:55:22.774949 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774954 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.774959 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.774964 | orchestrator | 2026-04-07 00:55:22.774969 | orchestrator | TASK [ceph-mon : Generate systemd ceph-mon target file] ************************ 2026-04-07 00:55:22.774974 | orchestrator | Tuesday 07 April 2026 00:48:58 +0000 (0:00:01.624) 0:03:54.356 ********* 2026-04-07 00:55:22.774979 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.774998 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.775004 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.775009 | orchestrator | 2026-04-07 00:55:22.775014 | orchestrator | TASK [ceph-mon : Enable ceph-mon.target] *************************************** 2026-04-07 00:55:22.775019 | orchestrator | Tuesday 07 April 2026 00:48:59 +0000 (0:00:01.276) 0:03:55.632 ********* 2026-04-07 00:55:22.775024 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.775028 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.775033 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.775038 | orchestrator | 2026-04-07 00:55:22.775043 | orchestrator | TASK [ceph-mon : Start the monitor service] ************************************ 2026-04-07 00:55:22.775048 | orchestrator | Tuesday 07 April 2026 00:49:01 +0000 (0:00:01.725) 0:03:57.358 ********* 2026-04-07 00:55:22.775053 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.775058 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.775063 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.775068 | orchestrator | 2026-04-07 00:55:22.775073 | orchestrator | TASK [ceph-mon : Include_tasks ceph_keys.yml] 
********************************** 2026-04-07 00:55:22.775078 | orchestrator | Tuesday 07 April 2026 00:49:02 +0000 (0:00:01.813) 0:03:59.172 ********* 2026-04-07 00:55:22.775083 | orchestrator | included: /ansible/roles/ceph-mon/tasks/ceph_keys.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.775088 | orchestrator | 2026-04-07 00:55:22.775092 | orchestrator | TASK [ceph-mon : Waiting for the monitor(s) to form the quorum...] ************* 2026-04-07 00:55:22.775097 | orchestrator | Tuesday 07 April 2026 00:49:03 +0000 (0:00:00.806) 0:03:59.979 ********* 2026-04-07 00:55:22.775102 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775107 | orchestrator | 2026-04-07 00:55:22.775112 | orchestrator | TASK [ceph-mon : Fetch ceph initial keys] ************************************** 2026-04-07 00:55:22.775117 | orchestrator | Tuesday 07 April 2026 00:49:05 +0000 (0:00:01.331) 0:04:01.310 ********* 2026-04-07 00:55:22.775122 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.775130 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.775135 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775140 | orchestrator | 2026-04-07 00:55:22.775145 | orchestrator | TASK [ceph-mon : Include secure_cluster.yml] *********************************** 2026-04-07 00:55:22.775149 | orchestrator | Tuesday 07 April 2026 00:49:14 +0000 (0:00:09.714) 0:04:11.024 ********* 2026-04-07 00:55:22.775155 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775159 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775164 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775169 | orchestrator | 2026-04-07 00:55:22.775174 | orchestrator | TASK [ceph-mon : Set cluster configs] ****************************************** 2026-04-07 00:55:22.775179 | orchestrator | Tuesday 07 April 2026 00:49:15 +0000 (0:00:00.291) 0:04:11.316 ********* 2026-04-07 00:55:22.775185 | orchestrator | changed: [testbed-node-0] => 
(item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ff7bf452e38827680e384a67330898f987a3f119'}}, {'key': 'public_network', 'value': '192.168.16.0/20'}]) 2026-04-07 00:55:22.775195 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ff7bf452e38827680e384a67330898f987a3f119'}}, {'key': 'cluster_network', 'value': '192.168.16.0/20'}]) 2026-04-07 00:55:22.775201 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ff7bf452e38827680e384a67330898f987a3f119'}}, {'key': 'osd_pool_default_crush_rule', 'value': -1}]) 2026-04-07 00:55:22.775207 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ff7bf452e38827680e384a67330898f987a3f119'}}, {'key': 'ms_bind_ipv6', 'value': 'False'}]) 2026-04-07 00:55:22.775212 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ff7bf452e38827680e384a67330898f987a3f119'}}, {'key': 'ms_bind_ipv4', 'value': 'True'}]) 
2026-04-07 00:55:22.775231 | orchestrator | skipping: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ff7bf452e38827680e384a67330898f987a3f119'}}, {'key': 'osd_crush_chooseleaf_type', 'value': '__omit_place_holder__ff7bf452e38827680e384a67330898f987a3f119'}])  2026-04-07 00:55:22.775241 | orchestrator | 2026-04-07 00:55:22.775250 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-04-07 00:55:22.775257 | orchestrator | Tuesday 07 April 2026 00:49:30 +0000 (0:00:15.880) 0:04:27.196 ********* 2026-04-07 00:55:22.775265 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775274 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775282 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775290 | orchestrator | 2026-04-07 00:55:22.775298 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] ********************************** 2026-04-07 00:55:22.775305 | orchestrator | Tuesday 07 April 2026 00:49:31 +0000 (0:00:00.270) 0:04:27.466 ********* 2026-04-07 00:55:22.775312 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.775320 | orchestrator | 2026-04-07 00:55:22.775328 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ******** 2026-04-07 00:55:22.775336 | orchestrator | Tuesday 07 April 2026 00:49:31 +0000 (0:00:00.598) 0:04:28.065 ********* 2026-04-07 00:55:22.775345 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775353 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.775362 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.775370 | orchestrator | 2026-04-07 00:55:22.775379 | orchestrator | RUNNING 
HANDLER [ceph-handler : Copy mon restart script] *********************** 2026-04-07 00:55:22.775386 | orchestrator | Tuesday 07 April 2026 00:49:32 +0000 (0:00:00.307) 0:04:28.373 ********* 2026-04-07 00:55:22.775395 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775404 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775413 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775427 | orchestrator | 2026-04-07 00:55:22.775436 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ******************** 2026-04-07 00:55:22.775443 | orchestrator | Tuesday 07 April 2026 00:49:32 +0000 (0:00:00.264) 0:04:28.637 ********* 2026-04-07 00:55:22.775450 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-07 00:55:22.775456 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-07 00:55:22.775460 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-07 00:55:22.775465 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775470 | orchestrator | 2026-04-07 00:55:22.775475 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] ********* 2026-04-07 00:55:22.775480 | orchestrator | Tuesday 07 April 2026 00:49:33 +0000 (0:00:00.677) 0:04:29.315 ********* 2026-04-07 00:55:22.775485 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775490 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.775495 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.775499 | orchestrator | 2026-04-07 00:55:22.775504 | orchestrator | PLAY [Apply role ceph-mgr] ***************************************************** 2026-04-07 00:55:22.775509 | orchestrator | 2026-04-07 00:55:22.775513 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-07 00:55:22.775518 | orchestrator | Tuesday 07 April 2026 00:49:33 +0000 (0:00:00.630) 0:04:29.945 ********* 2026-04-07 
00:55:22.775523 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.775527 | orchestrator | 2026-04-07 00:55:22.775532 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-04-07 00:55:22.775537 | orchestrator | Tuesday 07 April 2026 00:49:34 +0000 (0:00:00.345) 0:04:30.291 ********* 2026-04-07 00:55:22.775542 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.775546 | orchestrator | 2026-04-07 00:55:22.775551 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-07 00:55:22.775555 | orchestrator | Tuesday 07 April 2026 00:49:34 +0000 (0:00:00.474) 0:04:30.766 ********* 2026-04-07 00:55:22.775560 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775565 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.775569 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.775574 | orchestrator | 2026-04-07 00:55:22.775578 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-04-07 00:55:22.775583 | orchestrator | Tuesday 07 April 2026 00:49:35 +0000 (0:00:00.670) 0:04:31.436 ********* 2026-04-07 00:55:22.775588 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775592 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775597 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775602 | orchestrator | 2026-04-07 00:55:22.775606 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-04-07 00:55:22.775611 | orchestrator | Tuesday 07 April 2026 00:49:35 +0000 (0:00:00.261) 0:04:31.697 ********* 2026-04-07 00:55:22.775615 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775620 | orchestrator | skipping: 
[testbed-node-1] 2026-04-07 00:55:22.775625 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775629 | orchestrator | 2026-04-07 00:55:22.775634 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-07 00:55:22.775638 | orchestrator | Tuesday 07 April 2026 00:49:35 +0000 (0:00:00.220) 0:04:31.918 ********* 2026-04-07 00:55:22.775643 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775648 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775652 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775657 | orchestrator | 2026-04-07 00:55:22.775661 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-07 00:55:22.775666 | orchestrator | Tuesday 07 April 2026 00:49:36 +0000 (0:00:00.397) 0:04:32.315 ********* 2026-04-07 00:55:22.775671 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775680 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.775685 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.775689 | orchestrator | 2026-04-07 00:55:22.775694 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-04-07 00:55:22.775699 | orchestrator | Tuesday 07 April 2026 00:49:36 +0000 (0:00:00.681) 0:04:32.996 ********* 2026-04-07 00:55:22.775703 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775708 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775713 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775717 | orchestrator | 2026-04-07 00:55:22.775739 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-07 00:55:22.775745 | orchestrator | Tuesday 07 April 2026 00:49:37 +0000 (0:00:00.281) 0:04:33.278 ********* 2026-04-07 00:55:22.775750 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775754 | orchestrator | skipping: [testbed-node-1] 
2026-04-07 00:55:22.775759 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775763 | orchestrator | 2026-04-07 00:55:22.775768 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-07 00:55:22.775773 | orchestrator | Tuesday 07 April 2026 00:49:37 +0000 (0:00:00.242) 0:04:33.520 ********* 2026-04-07 00:55:22.775777 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.775782 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775787 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.775791 | orchestrator | 2026-04-07 00:55:22.775796 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-07 00:55:22.775801 | orchestrator | Tuesday 07 April 2026 00:49:37 +0000 (0:00:00.742) 0:04:34.263 ********* 2026-04-07 00:55:22.775805 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775810 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.775814 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.775819 | orchestrator | 2026-04-07 00:55:22.775824 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-07 00:55:22.775828 | orchestrator | Tuesday 07 April 2026 00:49:38 +0000 (0:00:00.927) 0:04:35.191 ********* 2026-04-07 00:55:22.775833 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775838 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775842 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775847 | orchestrator | 2026-04-07 00:55:22.775852 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-07 00:55:22.775856 | orchestrator | Tuesday 07 April 2026 00:49:39 +0000 (0:00:00.249) 0:04:35.440 ********* 2026-04-07 00:55:22.775861 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.775865 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.775873 | orchestrator | ok: 
[testbed-node-2] 2026-04-07 00:55:22.775878 | orchestrator | 2026-04-07 00:55:22.775882 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-07 00:55:22.775887 | orchestrator | Tuesday 07 April 2026 00:49:39 +0000 (0:00:00.267) 0:04:35.708 ********* 2026-04-07 00:55:22.775892 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775896 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775917 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775922 | orchestrator | 2026-04-07 00:55:22.775927 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-07 00:55:22.775931 | orchestrator | Tuesday 07 April 2026 00:49:39 +0000 (0:00:00.215) 0:04:35.924 ********* 2026-04-07 00:55:22.775936 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775941 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775945 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775950 | orchestrator | 2026-04-07 00:55:22.775954 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-07 00:55:22.775959 | orchestrator | Tuesday 07 April 2026 00:49:40 +0000 (0:00:00.371) 0:04:36.295 ********* 2026-04-07 00:55:22.775964 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.775968 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.775973 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.775981 | orchestrator | 2026-04-07 00:55:22.775986 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-07 00:55:22.775990 | orchestrator | Tuesday 07 April 2026 00:49:40 +0000 (0:00:00.237) 0:04:36.532 ********* 2026-04-07 00:55:22.775995 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.776000 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.776004 | orchestrator | skipping: 
[testbed-node-2] 2026-04-07 00:55:22.776009 | orchestrator | 2026-04-07 00:55:22.776014 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-07 00:55:22.776018 | orchestrator | Tuesday 07 April 2026 00:49:40 +0000 (0:00:00.257) 0:04:36.790 ********* 2026-04-07 00:55:22.776023 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.776028 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.776032 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.776037 | orchestrator | 2026-04-07 00:55:22.776042 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-07 00:55:22.776047 | orchestrator | Tuesday 07 April 2026 00:49:40 +0000 (0:00:00.250) 0:04:37.040 ********* 2026-04-07 00:55:22.776051 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.776056 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.776061 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.776065 | orchestrator | 2026-04-07 00:55:22.776070 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-07 00:55:22.776075 | orchestrator | Tuesday 07 April 2026 00:49:41 +0000 (0:00:00.423) 0:04:37.464 ********* 2026-04-07 00:55:22.776079 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.776084 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.776089 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.776093 | orchestrator | 2026-04-07 00:55:22.776098 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-07 00:55:22.776103 | orchestrator | Tuesday 07 April 2026 00:49:41 +0000 (0:00:00.277) 0:04:37.741 ********* 2026-04-07 00:55:22.776107 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.776112 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.776117 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.776121 | 
orchestrator | 2026-04-07 00:55:22.776126 | orchestrator | TASK [ceph-mgr : Set_fact container_exec_cmd] ********************************** 2026-04-07 00:55:22.776131 | orchestrator | Tuesday 07 April 2026 00:49:41 +0000 (0:00:00.440) 0:04:38.182 ********* 2026-04-07 00:55:22.776136 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-04-07 00:55:22.776140 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-04-07 00:55:22.776145 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-04-07 00:55:22.776149 | orchestrator | 2026-04-07 00:55:22.776154 | orchestrator | TASK [ceph-mgr : Include common.yml] ******************************************* 2026-04-07 00:55:22.776159 | orchestrator | Tuesday 07 April 2026 00:49:42 +0000 (0:00:00.672) 0:04:38.855 ********* 2026-04-07 00:55:22.776177 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/common.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.776182 | orchestrator | 2026-04-07 00:55:22.776187 | orchestrator | TASK [ceph-mgr : Create mgr directory] ***************************************** 2026-04-07 00:55:22.776192 | orchestrator | Tuesday 07 April 2026 00:49:43 +0000 (0:00:00.603) 0:04:39.458 ********* 2026-04-07 00:55:22.776196 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.776201 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.776206 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.776210 | orchestrator | 2026-04-07 00:55:22.776215 | orchestrator | TASK [ceph-mgr : Fetch ceph mgr keyring] *************************************** 2026-04-07 00:55:22.776220 | orchestrator | Tuesday 07 April 2026 00:49:43 +0000 (0:00:00.586) 0:04:40.045 ********* 2026-04-07 00:55:22.776224 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.776229 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.776233 | orchestrator | 
skipping: [testbed-node-2] 2026-04-07 00:55:22.776241 | orchestrator | 2026-04-07 00:55:22.776245 | orchestrator | TASK [ceph-mgr : Create ceph mgr keyring(s) on a mon node] ********************* 2026-04-07 00:55:22.776250 | orchestrator | Tuesday 07 April 2026 00:49:44 +0000 (0:00:00.328) 0:04:40.373 ********* 2026-04-07 00:55:22.776255 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-07 00:55:22.776259 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-07 00:55:22.776264 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-07 00:55:22.776268 | orchestrator | changed: [testbed-node-0 -> {{ groups[mon_group_name][0] }}] 2026-04-07 00:55:22.776273 | orchestrator | 2026-04-07 00:55:22.776277 | orchestrator | TASK [ceph-mgr : Set_fact _mgr_keys] ******************************************* 2026-04-07 00:55:22.776282 | orchestrator | Tuesday 07 April 2026 00:49:55 +0000 (0:00:11.115) 0:04:51.489 ********* 2026-04-07 00:55:22.776287 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.776292 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.776296 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.776301 | orchestrator | 2026-04-07 00:55:22.776307 | orchestrator | TASK [ceph-mgr : Get keys from monitors] *************************************** 2026-04-07 00:55:22.776312 | orchestrator | Tuesday 07 April 2026 00:49:55 +0000 (0:00:00.477) 0:04:51.967 ********* 2026-04-07 00:55:22.776317 | orchestrator | skipping: [testbed-node-0] => (item=None)  2026-04-07 00:55:22.776321 | orchestrator | skipping: [testbed-node-1] => (item=None)  2026-04-07 00:55:22.776326 | orchestrator | skipping: [testbed-node-2] => (item=None)  2026-04-07 00:55:22.776331 | orchestrator | ok: [testbed-node-0] => (item=None) 2026-04-07 00:55:22.776335 | orchestrator | ok: [testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:55:22.776340 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => 
(item=None) 2026-04-07 00:55:22.776344 | orchestrator | 2026-04-07 00:55:22.776349 | orchestrator | TASK [ceph-mgr : Copy ceph key(s) if needed] *********************************** 2026-04-07 00:55:22.776354 | orchestrator | Tuesday 07 April 2026 00:49:58 +0000 (0:00:02.371) 0:04:54.339 ********* 2026-04-07 00:55:22.776358 | orchestrator | skipping: [testbed-node-0] => (item=None)  2026-04-07 00:55:22.776363 | orchestrator | skipping: [testbed-node-1] => (item=None)  2026-04-07 00:55:22.776367 | orchestrator | skipping: [testbed-node-2] => (item=None)  2026-04-07 00:55:22.776372 | orchestrator | changed: [testbed-node-1] => (item=None) 2026-04-07 00:55:22.776376 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-07 00:55:22.776381 | orchestrator | changed: [testbed-node-2] => (item=None) 2026-04-07 00:55:22.776386 | orchestrator | 2026-04-07 00:55:22.776390 | orchestrator | TASK [ceph-mgr : Set mgr key permissions] ************************************** 2026-04-07 00:55:22.776395 | orchestrator | Tuesday 07 April 2026 00:49:59 +0000 (0:00:01.238) 0:04:55.577 ********* 2026-04-07 00:55:22.776399 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.776404 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.776409 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.776413 | orchestrator | 2026-04-07 00:55:22.776418 | orchestrator | TASK [ceph-mgr : Append dashboard modules to ceph_mgr_modules] ***************** 2026-04-07 00:55:22.776423 | orchestrator | Tuesday 07 April 2026 00:49:59 +0000 (0:00:00.627) 0:04:56.204 ********* 2026-04-07 00:55:22.776427 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.776432 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.776436 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.776441 | orchestrator | 2026-04-07 00:55:22.776446 | orchestrator | TASK [ceph-mgr : Include pre_requisite.yml] ************************************ 2026-04-07 00:55:22.776450 | 
orchestrator | Tuesday 07 April 2026 00:50:00 +0000 (0:00:00.395) 0:04:56.600 ********* 2026-04-07 00:55:22.776455 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.776459 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.776464 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.776468 | orchestrator | 2026-04-07 00:55:22.776473 | orchestrator | TASK [ceph-mgr : Include start_mgr.yml] **************************************** 2026-04-07 00:55:22.776478 | orchestrator | Tuesday 07 April 2026 00:50:00 +0000 (0:00:00.237) 0:04:56.838 ********* 2026-04-07 00:55:22.776485 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/start_mgr.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.776490 | orchestrator | 2026-04-07 00:55:22.776494 | orchestrator | TASK [ceph-mgr : Ensure systemd service override directory exists] ************* 2026-04-07 00:55:22.776499 | orchestrator | Tuesday 07 April 2026 00:50:00 +0000 (0:00:00.418) 0:04:57.257 ********* 2026-04-07 00:55:22.776504 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.776508 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.776513 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.776517 | orchestrator | 2026-04-07 00:55:22.776522 | orchestrator | TASK [ceph-mgr : Add ceph-mgr systemd service overrides] *********************** 2026-04-07 00:55:22.776527 | orchestrator | Tuesday 07 April 2026 00:50:01 +0000 (0:00:00.271) 0:04:57.529 ********* 2026-04-07 00:55:22.776531 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.776536 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.776540 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.776545 | orchestrator | 2026-04-07 00:55:22.776550 | orchestrator | TASK [ceph-mgr : Include_tasks systemd.yml] ************************************ 2026-04-07 00:55:22.776554 | orchestrator | Tuesday 07 April 2026 00:50:01 +0000 (0:00:00.394) 
0:04:57.923 ********* 2026-04-07 00:55:22.776570 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.776576 | orchestrator | 2026-04-07 00:55:22.776581 | orchestrator | TASK [ceph-mgr : Generate systemd unit file] *********************************** 2026-04-07 00:55:22.776585 | orchestrator | Tuesday 07 April 2026 00:50:02 +0000 (0:00:00.497) 0:04:58.421 ********* 2026-04-07 00:55:22.776590 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.776595 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.776599 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.776604 | orchestrator | 2026-04-07 00:55:22.776608 | orchestrator | TASK [ceph-mgr : Generate systemd ceph-mgr target file] ************************ 2026-04-07 00:55:22.776613 | orchestrator | Tuesday 07 April 2026 00:50:03 +0000 (0:00:01.122) 0:04:59.543 ********* 2026-04-07 00:55:22.776618 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.776622 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.776627 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.776631 | orchestrator | 2026-04-07 00:55:22.776636 | orchestrator | TASK [ceph-mgr : Enable ceph-mgr.target] *************************************** 2026-04-07 00:55:22.776641 | orchestrator | Tuesday 07 April 2026 00:50:04 +0000 (0:00:01.331) 0:05:00.874 ********* 2026-04-07 00:55:22.776645 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.776650 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.776655 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.776659 | orchestrator | 2026-04-07 00:55:22.776664 | orchestrator | TASK [ceph-mgr : Systemd start mgr] ******************************************** 2026-04-07 00:55:22.776668 | orchestrator | Tuesday 07 April 2026 00:50:06 +0000 (0:00:01.749) 0:05:02.624 ********* 2026-04-07 00:55:22.776673 | orchestrator | changed: 
[testbed-node-1] 2026-04-07 00:55:22.776678 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.776682 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.776687 | orchestrator | 2026-04-07 00:55:22.776691 | orchestrator | TASK [ceph-mgr : Include mgr_modules.yml] ************************************** 2026-04-07 00:55:22.776698 | orchestrator | Tuesday 07 April 2026 00:50:08 +0000 (0:00:01.966) 0:05:04.590 ********* 2026-04-07 00:55:22.776703 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.776707 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.776712 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/mgr_modules.yml for testbed-node-2 2026-04-07 00:55:22.776717 | orchestrator | 2026-04-07 00:55:22.776721 | orchestrator | TASK [ceph-mgr : Wait for all mgr to be up] ************************************ 2026-04-07 00:55:22.776726 | orchestrator | Tuesday 07 April 2026 00:50:08 +0000 (0:00:00.326) 0:05:04.917 ********* 2026-04-07 00:55:22.776731 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (30 retries left). 2026-04-07 00:55:22.776738 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (29 retries left). 2026-04-07 00:55:22.776743 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (28 retries left). 2026-04-07 00:55:22.776748 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (27 retries left). 2026-04-07 00:55:22.776753 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (26 retries left). 2026-04-07 00:55:22.776757 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (25 retries left). 
2026-04-07 00:55:22.776762 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)]
2026-04-07 00:55:22.776766 | orchestrator |
2026-04-07 00:55:22.776771 | orchestrator | TASK [ceph-mgr : Get enabled modules from ceph-mgr] ****************************
2026-04-07 00:55:22.776776 | orchestrator | Tuesday 07 April 2026 00:50:45 +0000 (0:00:36.694) 0:05:41.611 *********
2026-04-07 00:55:22.776780 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)]
2026-04-07 00:55:22.776785 | orchestrator |
2026-04-07 00:55:22.776790 | orchestrator | TASK [ceph-mgr : Set _ceph_mgr_modules fact (convert _ceph_mgr_modules.stdout to a dict)] ***
2026-04-07 00:55:22.776794 | orchestrator | Tuesday 07 April 2026 00:50:46 +0000 (0:00:01.483) 0:05:43.094 *********
2026-04-07 00:55:22.776799 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.776804 | orchestrator |
2026-04-07 00:55:22.776808 | orchestrator | TASK [ceph-mgr : Set _disabled_ceph_mgr_modules fact] **************************
2026-04-07 00:55:22.776813 | orchestrator | Tuesday 07 April 2026 00:50:47 +0000 (0:00:00.273) 0:05:43.368 *********
2026-04-07 00:55:22.776817 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.776822 | orchestrator |
2026-04-07 00:55:22.776827 | orchestrator | TASK [ceph-mgr : Disable ceph mgr enabled modules] *****************************
2026-04-07 00:55:22.776831 | orchestrator | Tuesday 07 April 2026 00:50:47 +0000 (0:00:00.127) 0:05:43.495 *********
2026-04-07 00:55:22.776836 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=iostat)
2026-04-07 00:55:22.776840 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=nfs)
2026-04-07 00:55:22.776845 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=restful)
2026-04-07 00:55:22.776849 | orchestrator |
2026-04-07 00:55:22.776854 | orchestrator | TASK [ceph-mgr : Add modules to ceph-mgr] **************************************
2026-04-07 00:55:22.776859 | orchestrator | Tuesday 07 April 2026 00:50:53 +0000 (0:00:06.328) 0:05:49.823 *********
2026-04-07 00:55:22.776863 | orchestrator | skipping: [testbed-node-2] => (item=balancer)
2026-04-07 00:55:22.776868 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=dashboard)
2026-04-07 00:55:22.776873 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=prometheus)
2026-04-07 00:55:22.776877 | orchestrator | skipping: [testbed-node-2] => (item=status)
2026-04-07 00:55:22.776882 | orchestrator |
2026-04-07 00:55:22.776886 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] **********************
2026-04-07 00:55:22.776891 | orchestrator | Tuesday 07 April 2026 00:50:58 +0000 (0:00:04.638) 0:05:54.462 *********
2026-04-07 00:55:22.776925 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:55:22.776937 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:55:22.776944 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:55:22.776951 | orchestrator |
2026-04-07 00:55:22.776959 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] **********************************
2026-04-07 00:55:22.776966 | orchestrator | Tuesday 07 April 2026 00:50:59 +0000 (0:00:01.053) 0:05:55.515 *********
2026-04-07 00:55:22.776972 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:55:22.776979 | orchestrator |
2026-04-07 00:55:22.776986 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ********
2026-04-07 00:55:22.776998 | orchestrator | Tuesday 07 April 2026 00:50:59 +0000 (0:00:00.501) 0:05:56.017 *********
2026-04-07 00:55:22.777006 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.777014 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.777022 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.777030 | orchestrator |
2026-04-07 00:55:22.777037 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] ***********************
2026-04-07 00:55:22.777045 | orchestrator | Tuesday 07 April 2026 00:51:00 +0000 (0:00:00.307) 0:05:56.325 *********
2026-04-07 00:55:22.777053 | orchestrator | changed: [testbed-node-0]
2026-04-07 00:55:22.777060 | orchestrator | changed: [testbed-node-2]
2026-04-07 00:55:22.777068 | orchestrator | changed: [testbed-node-1]
2026-04-07 00:55:22.777076 | orchestrator |
2026-04-07 00:55:22.777084 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ********************
2026-04-07 00:55:22.777092 | orchestrator | Tuesday 07 April 2026 00:51:01 +0000 (0:00:01.385) 0:05:57.711 *********
2026-04-07 00:55:22.777100 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2026-04-07 00:55:22.777108 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2026-04-07 00:55:22.777115 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2026-04-07 00:55:22.777127 | orchestrator | skipping: [testbed-node-0]
2026-04-07 00:55:22.777133 | orchestrator |
2026-04-07 00:55:22.777141 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] *********
2026-04-07 00:55:22.777148 | orchestrator | Tuesday 07 April 2026 00:51:02 +0000 (0:00:00.605) 0:05:58.316 *********
2026-04-07 00:55:22.777156 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:55:22.777164 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:55:22.777171 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:55:22.777179 | orchestrator |
2026-04-07 00:55:22.777187 | orchestrator | PLAY [Apply role ceph-osd] *****************************************************
2026-04-07 00:55:22.777195 | orchestrator |
2026-04-07 00:55:22.777201 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2026-04-07 00:55:22.777206 | orchestrator | Tuesday 07 April 2026 00:51:02 +0000 (0:00:00.527) 0:05:58.844 *********
2026-04-07 00:55:22.777211 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.777216 | orchestrator |
2026-04-07 00:55:22.777220 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2026-04-07 00:55:22.777225 | orchestrator | Tuesday 07 April 2026 00:51:03 +0000 (0:00:00.750) 0:05:59.595 *********
2026-04-07 00:55:22.777230 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.777234 | orchestrator |
2026-04-07 00:55:22.777239 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2026-04-07 00:55:22.777243 | orchestrator | Tuesday 07 April 2026 00:51:03 +0000 (0:00:00.497) 0:06:00.092 *********
2026-04-07 00:55:22.777248 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777253 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777257 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777262 | orchestrator |
2026-04-07 00:55:22.777267 | orchestrator | TASK [ceph-handler : Check for an osd container] *******************************
2026-04-07 00:55:22.777271 | orchestrator | Tuesday 07 April 2026 00:51:04 +0000 (0:00:00.280) 0:06:00.373 *********
2026-04-07 00:55:22.777276 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777280 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777285 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777290 | orchestrator |
2026-04-07 00:55:22.777294 | orchestrator | TASK [ceph-handler : Check for a mds container] ********************************
2026-04-07 00:55:22.777299 | orchestrator | Tuesday 07 April 2026 00:51:05 +0000 (0:00:00.977) 0:06:01.351 *********
2026-04-07 00:55:22.777304 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777308 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777313 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777317 | orchestrator |
2026-04-07 00:55:22.777326 | orchestrator | TASK [ceph-handler : Check for a rgw container] ********************************
2026-04-07 00:55:22.777330 | orchestrator | Tuesday 07 April 2026 00:51:05 +0000 (0:00:00.738) 0:06:02.089 *********
2026-04-07 00:55:22.777335 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777339 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777344 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777349 | orchestrator |
2026-04-07 00:55:22.777353 | orchestrator | TASK [ceph-handler : Check for a mgr container] ********************************
2026-04-07 00:55:22.777358 | orchestrator | Tuesday 07 April 2026 00:51:06 +0000 (0:00:00.787) 0:06:02.876 *********
2026-04-07 00:55:22.777362 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777367 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777372 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777376 | orchestrator |
2026-04-07 00:55:22.777381 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] *************************
2026-04-07 00:55:22.777386 | orchestrator | Tuesday 07 April 2026 00:51:06 +0000 (0:00:00.274) 0:06:03.151 *********
2026-04-07 00:55:22.777390 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777395 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777399 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777404 | orchestrator |
2026-04-07 00:55:22.777409 | orchestrator | TASK [ceph-handler : Check for a nfs container] ********************************
2026-04-07 00:55:22.777413 | orchestrator | Tuesday 07 April 2026 00:51:07 +0000 (0:00:00.554) 0:06:03.705 *********
2026-04-07 00:55:22.777418 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777422 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777446 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777451 | orchestrator |
2026-04-07 00:55:22.777456 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] *************************
2026-04-07 00:55:22.777460 | orchestrator | Tuesday 07 April 2026 00:51:07 +0000 (0:00:00.295) 0:06:04.000 *********
2026-04-07 00:55:22.777465 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777470 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777474 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777479 | orchestrator |
2026-04-07 00:55:22.777483 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] **********************
2026-04-07 00:55:22.777488 | orchestrator | Tuesday 07 April 2026 00:51:08 +0000 (0:00:00.778) 0:06:04.779 *********
2026-04-07 00:55:22.777493 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777497 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777502 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777506 | orchestrator |
2026-04-07 00:55:22.777511 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] *******************
2026-04-07 00:55:22.777515 | orchestrator | Tuesday 07 April 2026 00:51:09 +0000 (0:00:00.740) 0:06:05.520 *********
2026-04-07 00:55:22.777520 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777525 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777529 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777534 | orchestrator |
2026-04-07 00:55:22.777538 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ******************************
2026-04-07 00:55:22.777543 | orchestrator | Tuesday 07 April 2026 00:51:09 +0000 (0:00:00.297) 0:06:06.041 *********
2026-04-07 00:55:22.777548 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777552 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777557 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777561 | orchestrator |
2026-04-07 00:55:22.777566 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ******************************
2026-04-07 00:55:22.777571 | orchestrator | Tuesday 07 April 2026 00:51:10 +0000 (0:00:00.297) 0:06:06.339 *********
2026-04-07 00:55:22.777578 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777583 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777588 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777592 | orchestrator |
2026-04-07 00:55:22.777597 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ******************************
2026-04-07 00:55:22.777604 | orchestrator | Tuesday 07 April 2026 00:51:10 +0000 (0:00:00.341) 0:06:06.680 *********
2026-04-07 00:55:22.777609 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777614 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777618 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777623 | orchestrator |
2026-04-07 00:55:22.777627 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
2026-04-07 00:55:22.777632 | orchestrator | Tuesday 07 April 2026 00:51:10 +0000 (0:00:00.303) 0:06:06.983 *********
2026-04-07 00:55:22.777637 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777641 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777646 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777651 | orchestrator |
2026-04-07 00:55:22.777655 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
2026-04-07 00:55:22.777660 | orchestrator | Tuesday 07 April 2026 00:51:11 +0000 (0:00:00.567) 0:06:07.551 *********
2026-04-07 00:55:22.777664 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777669 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777674 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777678 | orchestrator |
2026-04-07 00:55:22.777683 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ******************************
2026-04-07 00:55:22.777687 | orchestrator | Tuesday 07 April 2026 00:51:11 +0000 (0:00:00.317) 0:06:07.868 *********
2026-04-07 00:55:22.777692 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777696 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777701 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777706 | orchestrator |
2026-04-07 00:55:22.777710 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ******************************
2026-04-07 00:55:22.777715 | orchestrator | Tuesday 07 April 2026 00:51:11 +0000 (0:00:00.298) 0:06:08.167 *********
2026-04-07 00:55:22.777719 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777724 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777728 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777733 | orchestrator |
2026-04-07 00:55:22.777738 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] ****************************
2026-04-07 00:55:22.777742 | orchestrator | Tuesday 07 April 2026 00:51:12 +0000 (0:00:00.305) 0:06:08.473 *********
2026-04-07 00:55:22.777747 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777751 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777756 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777761 | orchestrator |
2026-04-07 00:55:22.777765 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] *************************
2026-04-07 00:55:22.777770 | orchestrator | Tuesday 07 April 2026 00:51:12 +0000 (0:00:00.574) 0:06:09.047 *********
2026-04-07 00:55:22.777774 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777779 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777783 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777788 | orchestrator |
2026-04-07 00:55:22.777793 | orchestrator | TASK [ceph-osd : Set_fact add_osd] *********************************************
2026-04-07 00:55:22.777797 | orchestrator | Tuesday 07 April 2026 00:51:13 +0000 (0:00:00.592) 0:06:09.640 *********
2026-04-07 00:55:22.777802 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777806 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777811 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777815 | orchestrator |
2026-04-07 00:55:22.777820 | orchestrator | TASK [ceph-osd : Set_fact container_exec_cmd] **********************************
2026-04-07 00:55:22.777825 | orchestrator | Tuesday 07 April 2026 00:51:13 +0000 (0:00:00.311) 0:06:09.952 *********
2026-04-07 00:55:22.777829 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-07 00:55:22.777834 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-07 00:55:22.777838 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-07 00:55:22.777843 | orchestrator |
2026-04-07 00:55:22.777847 | orchestrator | TASK [ceph-osd : Include_tasks system_tuning.yml] ******************************
2026-04-07 00:55:22.777855 | orchestrator | Tuesday 07 April 2026 00:51:14 +0000 (0:00:00.894) 0:06:10.846 *********
2026-04-07 00:55:22.777862 | orchestrator | included: /ansible/roles/ceph-osd/tasks/system_tuning.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.777867 | orchestrator |
2026-04-07 00:55:22.777871 | orchestrator | TASK [ceph-osd : Create tmpfiles.d directory] **********************************
2026-04-07 00:55:22.777876 | orchestrator | Tuesday 07 April 2026 00:51:15 +0000 (0:00:00.785) 0:06:11.632 *********
2026-04-07 00:55:22.777880 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777885 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777890 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777894 | orchestrator |
2026-04-07 00:55:22.777912 | orchestrator | TASK [ceph-osd : Disable transparent hugepage] *********************************
2026-04-07 00:55:22.777920 | orchestrator | Tuesday 07 April 2026 00:51:15 +0000 (0:00:00.308) 0:06:11.941 *********
2026-04-07 00:55:22.777928 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.777937 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.777945 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.777953 | orchestrator |
2026-04-07 00:55:22.777958 | orchestrator | TASK [ceph-osd : Get default vm.min_free_kbytes] *******************************
2026-04-07 00:55:22.777963 | orchestrator | Tuesday 07 April 2026 00:51:16 +0000 (0:00:00.329) 0:06:12.271 *********
2026-04-07 00:55:22.777968 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.777972 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.777977 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.777982 | orchestrator |
2026-04-07 00:55:22.777986 | orchestrator | TASK [ceph-osd : Set_fact vm_min_free_kbytes] **********************************
2026-04-07 00:55:22.777991 | orchestrator | Tuesday 07 April 2026 00:51:16 +0000 (0:00:00.981) 0:06:13.252 *********
2026-04-07 00:55:22.777995 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.778000 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.778005 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.778009 | orchestrator |
2026-04-07 00:55:22.778033 | orchestrator | TASK [ceph-osd : Apply operating system tuning] ********************************
2026-04-07 00:55:22.778040 | orchestrator | Tuesday 07 April 2026 00:51:17 +0000 (0:00:00.305) 0:06:13.557 *********
2026-04-07 00:55:22.778045 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2026-04-07 00:55:22.778050 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2026-04-07 00:55:22.778054 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2026-04-07 00:55:22.778059 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.file-max', 'value': 26234859})
2026-04-07 00:55:22.778064 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.file-max', 'value': 26234859})
2026-04-07 00:55:22.778068 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.file-max', 'value': 26234859})
2026-04-07 00:55:22.778073 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2026-04-07 00:55:22.778078 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2026-04-07 00:55:22.778082 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 10})
2026-04-07 00:55:22.778087 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 10})
2026-04-07 00:55:22.778091 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2026-04-07 00:55:22.778096 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2026-04-07 00:55:22.778101 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2026-04-07 00:55:22.778105 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 10})
2026-04-07 00:55:22.778110 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2026-04-07 00:55:22.778120 | orchestrator |
2026-04-07 00:55:22.778125 | orchestrator | TASK [ceph-osd : Install dependencies] *****************************************
2026-04-07 00:55:22.778129 | orchestrator | Tuesday 07 April 2026 00:51:20 +0000 (0:00:02.931) 0:06:16.489 *********
2026-04-07 00:55:22.778134 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.778139 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.778143 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.778148 | orchestrator |
2026-04-07 00:55:22.778153 | orchestrator | TASK [ceph-osd : Include_tasks common.yml] *************************************
2026-04-07 00:55:22.778157 | orchestrator | Tuesday 07 April 2026 00:51:20 +0000 (0:00:00.286) 0:06:16.775 *********
2026-04-07 00:55:22.778162 | orchestrator | included: /ansible/roles/ceph-osd/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.778166 | orchestrator |
2026-04-07 00:55:22.778171 | orchestrator | TASK [ceph-osd : Create bootstrap-osd and osd directories] *********************
2026-04-07 00:55:22.778176 | orchestrator | Tuesday 07 April 2026 00:51:21 +0000 (0:00:00.756) 0:06:17.531 *********
2026-04-07 00:55:22.778180 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd/)
2026-04-07 00:55:22.778185 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd/)
2026-04-07 00:55:22.778189 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd/)
2026-04-07 00:55:22.778194 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/osd/)
2026-04-07 00:55:22.778199 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/osd/)
2026-04-07 00:55:22.778203 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/osd/)
2026-04-07 00:55:22.778208 | orchestrator |
2026-04-07 00:55:22.778212 | orchestrator | TASK [ceph-osd : Get keys from monitors] ***************************************
2026-04-07 00:55:22.778217 | orchestrator | Tuesday 07 April 2026 00:51:22 +0000 (0:00:01.023) 0:06:18.555 *********
2026-04-07 00:55:22.778228 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-07 00:55:22.778233 | orchestrator | skipping: [testbed-node-3] => (item=None)
2026-04-07 00:55:22.778238 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}]
2026-04-07 00:55:22.778242 | orchestrator |
2026-04-07 00:55:22.778247 | orchestrator | TASK [ceph-osd : Copy ceph key(s) if needed] ***********************************
2026-04-07 00:55:22.778251 | orchestrator | Tuesday 07 April 2026 00:51:24 +0000 (0:00:02.009) 0:06:20.564 *********
2026-04-07 00:55:22.778256 | orchestrator | changed: [testbed-node-4] => (item=None)
2026-04-07 00:55:22.778261 | orchestrator | skipping: [testbed-node-4] => (item=None)
2026-04-07 00:55:22.778265 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.778270 | orchestrator | changed: [testbed-node-3] => (item=None)
2026-04-07 00:55:22.778275 | orchestrator | skipping: [testbed-node-3] => (item=None)
2026-04-07 00:55:22.778279 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.778284 | orchestrator | changed: [testbed-node-5] => (item=None)
2026-04-07 00:55:22.778289 | orchestrator | skipping: [testbed-node-5] => (item=None)
2026-04-07 00:55:22.778293 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.778298 | orchestrator |
2026-04-07 00:55:22.778303 | orchestrator | TASK [ceph-osd : Set noup flag] ************************************************
2026-04-07 00:55:22.778307 | orchestrator | Tuesday 07 April 2026 00:51:25 +0000 (0:00:01.038) 0:06:21.603 *********
2026-04-07 00:55:22.778312 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2026-04-07 00:55:22.778316 | orchestrator |
2026-04-07 00:55:22.778321 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm.yml] ******************************
2026-04-07 00:55:22.778326 | orchestrator | Tuesday 07 April 2026 00:51:28 +0000 (0:00:03.001) 0:06:24.605 *********
2026-04-07 00:55:22.778330 | orchestrator | included: /ansible/roles/ceph-osd/tasks/scenarios/lvm.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.778335 | orchestrator |
2026-04-07 00:55:22.778342 | orchestrator | TASK [ceph-osd : Use ceph-volume to create osds] *******************************
2026-04-07 00:55:22.778349 | orchestrator | Tuesday 07 April 2026 00:51:28 +0000 (0:00:00.447) 0:06:25.052 *********
2026-04-07 00:55:22.778354 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-0842dd12-8111-558f-8152-9e8987e1446c', 'data_vg': 'ceph-0842dd12-8111-558f-8152-9e8987e1446c'})
2026-04-07 00:55:22.778359 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-e0113da9-ca02-59fe-bdca-d5482abf5fe2', 'data_vg': 'ceph-e0113da9-ca02-59fe-bdca-d5482abf5fe2'})
2026-04-07 00:55:22.778364 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-f75c5f18-ff10-5900-9978-917c146f798b', 'data_vg': 'ceph-f75c5f18-ff10-5900-9978-917c146f798b'})
2026-04-07 00:55:22.778369 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-e59b5a6a-4894-5883-a5b3-f677d5bde0c7', 'data_vg': 'ceph-e59b5a6a-4894-5883-a5b3-f677d5bde0c7'})
2026-04-07 00:55:22.778374 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-9eeb51fd-cca7-5129-bb0c-15bc93c67722', 'data_vg': 'ceph-9eeb51fd-cca7-5129-bb0c-15bc93c67722'})
2026-04-07 00:55:22.778378 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-47815a29-012a-570b-a074-b4436c47a2f4', 'data_vg': 'ceph-47815a29-012a-570b-a074-b4436c47a2f4'})
2026-04-07 00:55:22.778383 | orchestrator |
2026-04-07 00:55:22.778388 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm-batch.yml] ************************
2026-04-07 00:55:22.778392 | orchestrator | Tuesday 07 April 2026 00:52:12 +0000 (0:00:43.424) 0:07:08.477 *********
2026-04-07 00:55:22.778397 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.778402 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.778406 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.778411 | orchestrator |
2026-04-07 00:55:22.778415 | orchestrator | TASK [ceph-osd : Include_tasks start_osds.yml] *********************************
2026-04-07 00:55:22.778420 | orchestrator | Tuesday 07 April 2026 00:52:12 +0000 (0:00:00.490) 0:07:08.968 *********
2026-04-07 00:55:22.778425 | orchestrator | included: /ansible/roles/ceph-osd/tasks/start_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.778429 | orchestrator |
2026-04-07 00:55:22.778434 | orchestrator | TASK [ceph-osd : Get osd ids] **************************************************
2026-04-07 00:55:22.778439 | orchestrator | Tuesday 07 April 2026 00:52:13 +0000 (0:00:00.644) 0:07:09.459 *********
2026-04-07 00:55:22.778443 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.778448 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.778453 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.778457 | orchestrator |
2026-04-07 00:55:22.778462 | orchestrator | TASK [ceph-osd : Collect osd ids] **********************************************
2026-04-07 00:55:22.778466 | orchestrator | Tuesday 07 April 2026 00:52:13 +0000 (0:00:00.644) 0:07:10.104 *********
2026-04-07 00:55:22.778471 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.778476 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.778480 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.778485 | orchestrator |
2026-04-07 00:55:22.778490 | orchestrator | TASK [ceph-osd : Include_tasks systemd.yml] ************************************
2026-04-07 00:55:22.778494 | orchestrator | Tuesday 07 April 2026 00:52:16 +0000 (0:00:02.937) 0:07:13.042 *********
2026-04-07 00:55:22.778499 | orchestrator | included: /ansible/roles/ceph-osd/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.778504 | orchestrator |
2026-04-07 00:55:22.778508 | orchestrator | TASK [ceph-osd : Generate systemd unit file] ***********************************
2026-04-07 00:55:22.778513 | orchestrator | Tuesday 07 April 2026 00:52:17 +0000 (0:00:00.494) 0:07:13.537 *********
2026-04-07 00:55:22.778518 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.778522 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.778527 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.778532 | orchestrator |
2026-04-07 00:55:22.778539 | orchestrator | TASK [ceph-osd : Generate systemd ceph-osd target file] ************************
2026-04-07 00:55:22.778544 | orchestrator | Tuesday 07 April 2026 00:52:18 +0000 (0:00:01.135) 0:07:14.672 *********
2026-04-07 00:55:22.778548 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.778556 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.778560 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.778565 | orchestrator |
2026-04-07 00:55:22.778570 | orchestrator | TASK [ceph-osd : Enable ceph-osd.target] ***************************************
2026-04-07 00:55:22.778574 | orchestrator | Tuesday 07 April 2026 00:52:19 +0000 (0:00:01.186) 0:07:15.858 *********
2026-04-07 00:55:22.778579 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.778584 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.778588 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.778593 | orchestrator |
2026-04-07 00:55:22.778597 | orchestrator | TASK [ceph-osd : Ensure systemd service override directory exists] *************
2026-04-07 00:55:22.778602 | orchestrator | Tuesday 07 April 2026 00:52:21 +0000 (0:00:02.123) 0:07:17.982 *********
2026-04-07 00:55:22.778607 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.778611 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.778616 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.778620 | orchestrator |
2026-04-07 00:55:22.778625 | orchestrator | TASK [ceph-osd : Add ceph-osd systemd service overrides] ***********************
2026-04-07 00:55:22.778630 | orchestrator | Tuesday 07 April 2026 00:52:22 +0000 (0:00:00.311) 0:07:18.294 *********
2026-04-07 00:55:22.778634 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.778642 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.778652 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.778663 | orchestrator |
2026-04-07 00:55:22.778670 | orchestrator | TASK [ceph-osd : Ensure /var/lib/ceph/osd/- is present] *********
2026-04-07 00:55:22.778677 | orchestrator | Tuesday 07 April 2026 00:52:22 +0000 (0:00:00.295) 0:07:18.589 *********
2026-04-07 00:55:22.778684 | orchestrator | ok: [testbed-node-3] => (item=1)
2026-04-07 00:55:22.778691 | orchestrator | ok: [testbed-node-4] => (item=0)
2026-04-07 00:55:22.778702 | orchestrator | ok: [testbed-node-5] => (item=4)
2026-04-07 00:55:22.778710 | orchestrator | ok: [testbed-node-3] => (item=3)
2026-04-07 00:55:22.778718 | orchestrator | ok: [testbed-node-4] => (item=5)
2026-04-07 00:55:22.778726 | orchestrator | ok: [testbed-node-5] => (item=2)
2026-04-07 00:55:22.778734 | orchestrator |
2026-04-07 00:55:22.778742 | orchestrator | TASK [ceph-osd : Write run file in /var/lib/ceph/osd/xxxx/run] *****************
2026-04-07 00:55:22.778750 | orchestrator | Tuesday 07 April 2026 00:52:23 +0000 (0:00:01.108) 0:07:19.697 *********
2026-04-07 00:55:22.778755 | orchestrator | changed: [testbed-node-3] => (item=1)
2026-04-07 00:55:22.778760 | orchestrator | changed: [testbed-node-4] => (item=0)
2026-04-07 00:55:22.778764 | orchestrator | changed: [testbed-node-5] => (item=4)
2026-04-07 00:55:22.778769 | orchestrator | changed: [testbed-node-3] => (item=3)
2026-04-07 00:55:22.778774 | orchestrator | changed: [testbed-node-4] => (item=5)
2026-04-07 00:55:22.778778 | orchestrator | changed: [testbed-node-5] => (item=2)
2026-04-07 00:55:22.778783 | orchestrator |
2026-04-07 00:55:22.778788 | orchestrator | TASK [ceph-osd : Systemd start osd] ********************************************
2026-04-07 00:55:22.778792 | orchestrator | Tuesday 07 April 2026 00:52:25 +0000 (0:00:02.562) 0:07:22.260 *********
2026-04-07 00:55:22.778797 | orchestrator | changed: [testbed-node-3] => (item=1)
2026-04-07 00:55:22.778801 | orchestrator | changed: [testbed-node-4] => (item=0)
2026-04-07 00:55:22.778806 | orchestrator | changed: [testbed-node-5] => (item=4)
2026-04-07 00:55:22.778811 | orchestrator | changed: [testbed-node-4] => (item=5)
2026-04-07 00:55:22.778815 | orchestrator | changed: [testbed-node-3] => (item=3)
2026-04-07 00:55:22.778820 | orchestrator | changed: [testbed-node-5] => (item=2)
2026-04-07 00:55:22.778824 | orchestrator |
2026-04-07 00:55:22.778829 | orchestrator | TASK [ceph-osd : Unset noup flag] **********************************************
2026-04-07 00:55:22.778834 | orchestrator | Tuesday 07 April 2026 00:52:29 +0000 (0:00:03.594) 0:07:25.855 *********
2026-04-07 00:55:22.778838 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.778843 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.778847 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)]
2026-04-07 00:55:22.778856 | orchestrator |
2026-04-07 00:55:22.778861 | orchestrator | TASK [ceph-osd : Wait for all osd to be up] ************************************
2026-04-07 00:55:22.778866 | orchestrator | Tuesday 07 April 2026 00:52:32 +0000 (0:00:02.972) 0:07:28.827 *********
2026-04-07 00:55:22.778870 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.778875 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.778879 | orchestrator | FAILED - RETRYING: [testbed-node-5 -> testbed-node-0]: Wait for all osd to be up (60 retries left).
2026-04-07 00:55:22.778884 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)]
2026-04-07 00:55:22.778889 | orchestrator |
2026-04-07 00:55:22.778893 | orchestrator | TASK [ceph-osd : Include crush_rules.yml] **************************************
2026-04-07 00:55:22.778924 | orchestrator | Tuesday 07 April 2026 00:52:45 +0000 (0:00:12.751) 0:07:41.578 *********
2026-04-07 00:55:22.778929 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.778934 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.778939 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.778943 | orchestrator |
2026-04-07 00:55:22.778948 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] **********************
2026-04-07 00:55:22.778953 | orchestrator | Tuesday 07 April 2026 00:52:46 +0000 (0:00:00.950) 0:07:42.528 *********
2026-04-07 00:55:22.778957 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.778962 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.778967 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.778971 | orchestrator |
2026-04-07 00:55:22.778976 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] **********************************
2026-04-07 00:55:22.778980 | orchestrator | Tuesday 07 April 2026 00:52:46 +0000 (0:00:00.267) 0:07:42.795 *********
2026-04-07 00:55:22.778985 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.778990 | orchestrator |
2026-04-07 00:55:22.778994 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] **********************
2026-04-07 00:55:22.778999 | orchestrator | Tuesday 07 April 2026 00:52:47 +0000 (0:00:00.592) 0:07:43.387 *********
2026-04-07 00:55:22.779008 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-07 00:55:22.779012 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-07 00:55:22.779017 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-07 00:55:22.779022 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.779026 | orchestrator |
2026-04-07 00:55:22.779031 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ********
2026-04-07 00:55:22.779036 | orchestrator | Tuesday 07 April 2026 00:52:47 +0000 (0:00:00.343) 0:07:43.731 *********
2026-04-07 00:55:22.779040 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.779045 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.779050 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.779054 | orchestrator |
2026-04-07 00:55:22.779061 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] *******************************
2026-04-07 00:55:22.779066 | orchestrator | Tuesday 07 April 2026 00:52:47 +0000 (0:00:00.260) 0:07:43.991 *********
2026-04-07 00:55:22.779071 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.779075 | orchestrator |
2026-04-07 00:55:22.779080 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] ***********************
2026-04-07 00:55:22.779084 | orchestrator | Tuesday 07 April 2026 00:52:47 +0000 (0:00:00.189) 0:07:44.181 *********
2026-04-07 00:55:22.779089 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.779093 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.779098 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.779103 | orchestrator |
2026-04-07 00:55:22.779107 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] *********************************
2026-04-07 00:55:22.779112 | orchestrator | Tuesday 07 April 2026 00:52:48 +0000 (0:00:00.445) 0:07:44.626 *********
2026-04-07 00:55:22.779116 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.779121 | orchestrator |
2026-04-07 00:55:22.779129 | orchestrator | RUNNING
HANDLER [ceph-handler : Get balancer module status] ******************** 2026-04-07 00:55:22.779136 | orchestrator | Tuesday 07 April 2026 00:52:48 +0000 (0:00:00.194) 0:07:44.820 ********* 2026-04-07 00:55:22.779141 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779146 | orchestrator | 2026-04-07 00:55:22.779150 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] ************** 2026-04-07 00:55:22.779155 | orchestrator | Tuesday 07 April 2026 00:52:48 +0000 (0:00:00.184) 0:07:45.005 ********* 2026-04-07 00:55:22.779159 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779164 | orchestrator | 2026-04-07 00:55:22.779169 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ****************************** 2026-04-07 00:55:22.779174 | orchestrator | Tuesday 07 April 2026 00:52:48 +0000 (0:00:00.105) 0:07:45.111 ********* 2026-04-07 00:55:22.779178 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779184 | orchestrator | 2026-04-07 00:55:22.779191 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] ***************** 2026-04-07 00:55:22.779202 | orchestrator | Tuesday 07 April 2026 00:52:49 +0000 (0:00:00.189) 0:07:45.301 ********* 2026-04-07 00:55:22.779211 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779218 | orchestrator | 2026-04-07 00:55:22.779225 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] ******************* 2026-04-07 00:55:22.779232 | orchestrator | Tuesday 07 April 2026 00:52:49 +0000 (0:00:00.190) 0:07:45.491 ********* 2026-04-07 00:55:22.779239 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.779246 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.779254 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.779262 | orchestrator | skipping: [testbed-node-3] 2026-04-07 
00:55:22.779269 | orchestrator | 2026-04-07 00:55:22.779276 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] ********* 2026-04-07 00:55:22.779283 | orchestrator | Tuesday 07 April 2026 00:52:49 +0000 (0:00:00.344) 0:07:45.836 ********* 2026-04-07 00:55:22.779290 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779298 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.779305 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.779313 | orchestrator | 2026-04-07 00:55:22.779320 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] *************** 2026-04-07 00:55:22.779331 | orchestrator | Tuesday 07 April 2026 00:52:49 +0000 (0:00:00.280) 0:07:46.117 ********* 2026-04-07 00:55:22.779341 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779347 | orchestrator | 2026-04-07 00:55:22.779354 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] **************************** 2026-04-07 00:55:22.779362 | orchestrator | Tuesday 07 April 2026 00:52:50 +0000 (0:00:00.194) 0:07:46.311 ********* 2026-04-07 00:55:22.779369 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779376 | orchestrator | 2026-04-07 00:55:22.779383 | orchestrator | PLAY [Apply role ceph-crash] *************************************************** 2026-04-07 00:55:22.779390 | orchestrator | 2026-04-07 00:55:22.779396 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-07 00:55:22.779404 | orchestrator | Tuesday 07 April 2026 00:52:50 +0000 (0:00:00.920) 0:07:47.232 ********* 2026-04-07 00:55:22.779411 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.779419 | orchestrator | 2026-04-07 00:55:22.779427 | orchestrator | TASK [ceph-handler : Include 
check_running_containers.yml] ********************* 2026-04-07 00:55:22.779434 | orchestrator | Tuesday 07 April 2026 00:52:51 +0000 (0:00:00.944) 0:07:48.176 ********* 2026-04-07 00:55:22.779442 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.779450 | orchestrator | 2026-04-07 00:55:22.779458 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-07 00:55:22.779471 | orchestrator | Tuesday 07 April 2026 00:52:53 +0000 (0:00:01.162) 0:07:49.338 ********* 2026-04-07 00:55:22.779479 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779487 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.779500 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.779507 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.779514 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.779521 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.779527 | orchestrator | 2026-04-07 00:55:22.779535 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-04-07 00:55:22.779541 | orchestrator | Tuesday 07 April 2026 00:52:53 +0000 (0:00:00.862) 0:07:50.201 ********* 2026-04-07 00:55:22.779548 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.779555 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.779562 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.779569 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.779577 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.779584 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.779591 | orchestrator | 2026-04-07 00:55:22.779597 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-04-07 00:55:22.779604 | orchestrator | Tuesday 07 
April 2026 00:52:54 +0000 (0:00:00.896) 0:07:51.098 ********* 2026-04-07 00:55:22.779611 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.779618 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.779625 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.779632 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.779639 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.779646 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.779653 | orchestrator | 2026-04-07 00:55:22.779661 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-07 00:55:22.779668 | orchestrator | Tuesday 07 April 2026 00:52:55 +0000 (0:00:00.629) 0:07:51.728 ********* 2026-04-07 00:55:22.779675 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.779682 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.779688 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.779695 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.779703 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.779709 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.779716 | orchestrator | 2026-04-07 00:55:22.779726 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-07 00:55:22.779734 | orchestrator | Tuesday 07 April 2026 00:52:56 +0000 (0:00:00.878) 0:07:52.606 ********* 2026-04-07 00:55:22.779742 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779749 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.779755 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.779762 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.779769 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.779776 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.779783 | orchestrator | 2026-04-07 00:55:22.779790 | orchestrator | TASK [ceph-handler : Check for a rbd mirror 
container] ************************* 2026-04-07 00:55:22.779796 | orchestrator | Tuesday 07 April 2026 00:52:57 +0000 (0:00:01.049) 0:07:53.656 ********* 2026-04-07 00:55:22.779803 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779811 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.779818 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.779825 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.779832 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.779839 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.779846 | orchestrator | 2026-04-07 00:55:22.779853 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-07 00:55:22.779860 | orchestrator | Tuesday 07 April 2026 00:52:58 +0000 (0:00:00.900) 0:07:54.556 ********* 2026-04-07 00:55:22.779867 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.779873 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.779884 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.779892 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.779911 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.779918 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.779925 | orchestrator | 2026-04-07 00:55:22.779932 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-07 00:55:22.779939 | orchestrator | Tuesday 07 April 2026 00:52:58 +0000 (0:00:00.623) 0:07:55.180 ********* 2026-04-07 00:55:22.779946 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.779953 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.779960 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.779967 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.779974 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.779981 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.779988 | 
orchestrator | 2026-04-07 00:55:22.779995 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-07 00:55:22.780002 | orchestrator | Tuesday 07 April 2026 00:53:00 +0000 (0:00:01.253) 0:07:56.433 ********* 2026-04-07 00:55:22.780009 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.780016 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.780023 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.780029 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.780036 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.780043 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.780050 | orchestrator | 2026-04-07 00:55:22.780057 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-07 00:55:22.780065 | orchestrator | Tuesday 07 April 2026 00:53:01 +0000 (0:00:01.022) 0:07:57.456 ********* 2026-04-07 00:55:22.780072 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.780079 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.780086 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.780093 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.780100 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.780106 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.780113 | orchestrator | 2026-04-07 00:55:22.780120 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-07 00:55:22.780127 | orchestrator | Tuesday 07 April 2026 00:53:02 +0000 (0:00:00.816) 0:07:58.272 ********* 2026-04-07 00:55:22.780134 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.780141 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.780148 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.780155 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.780162 | orchestrator | ok: [testbed-node-1] 2026-04-07 
00:55:22.780169 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.780176 | orchestrator | 2026-04-07 00:55:22.780183 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-07 00:55:22.780190 | orchestrator | Tuesday 07 April 2026 00:53:02 +0000 (0:00:00.560) 0:07:58.834 ********* 2026-04-07 00:55:22.780197 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.780204 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.780214 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.780222 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.780229 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.780236 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.780243 | orchestrator | 2026-04-07 00:55:22.780250 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-07 00:55:22.780257 | orchestrator | Tuesday 07 April 2026 00:53:03 +0000 (0:00:00.849) 0:07:59.683 ********* 2026-04-07 00:55:22.780264 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.780271 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.780278 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.780285 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.780292 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.780299 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.780306 | orchestrator | 2026-04-07 00:55:22.780313 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-07 00:55:22.780325 | orchestrator | Tuesday 07 April 2026 00:53:04 +0000 (0:00:00.621) 0:08:00.305 ********* 2026-04-07 00:55:22.780332 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.780339 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.780345 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.780352 | orchestrator | skipping: [testbed-node-0] 
2026-04-07 00:55:22.780359 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.780366 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.780373 | orchestrator | 2026-04-07 00:55:22.780380 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-07 00:55:22.780387 | orchestrator | Tuesday 07 April 2026 00:53:04 +0000 (0:00:00.825) 0:08:01.131 ********* 2026-04-07 00:55:22.780394 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.780401 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.780408 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.780415 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.780422 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.780429 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.780436 | orchestrator | 2026-04-07 00:55:22.780446 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-07 00:55:22.780453 | orchestrator | Tuesday 07 April 2026 00:53:05 +0000 (0:00:00.648) 0:08:01.779 ********* 2026-04-07 00:55:22.780460 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.780467 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.780474 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.780481 | orchestrator | skipping: [testbed-node-0] 2026-04-07 00:55:22.780488 | orchestrator | skipping: [testbed-node-1] 2026-04-07 00:55:22.780495 | orchestrator | skipping: [testbed-node-2] 2026-04-07 00:55:22.780502 | orchestrator | 2026-04-07 00:55:22.780509 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-07 00:55:22.780516 | orchestrator | Tuesday 07 April 2026 00:53:06 +0000 (0:00:00.832) 0:08:02.612 ********* 2026-04-07 00:55:22.780523 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.780530 | orchestrator | skipping: [testbed-node-4] 
2026-04-07 00:55:22.780537 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.780544 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.780551 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.780558 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.780565 | orchestrator | 2026-04-07 00:55:22.780572 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-07 00:55:22.780579 | orchestrator | Tuesday 07 April 2026 00:53:07 +0000 (0:00:00.724) 0:08:03.336 ********* 2026-04-07 00:55:22.780586 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.780592 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.780599 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.780606 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.780613 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.780620 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.780627 | orchestrator | 2026-04-07 00:55:22.780634 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-07 00:55:22.780641 | orchestrator | Tuesday 07 April 2026 00:53:08 +0000 (0:00:01.001) 0:08:04.338 ********* 2026-04-07 00:55:22.780648 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.780655 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.780662 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.780669 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.780675 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.780682 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.780689 | orchestrator | 2026-04-07 00:55:22.780696 | orchestrator | TASK [ceph-crash : Create client.crash keyring] ******************************** 2026-04-07 00:55:22.780703 | orchestrator | Tuesday 07 April 2026 00:53:09 +0000 (0:00:01.335) 0:08:05.674 ********* 2026-04-07 00:55:22.780710 | orchestrator | changed: [testbed-node-3 -> 
testbed-node-0(192.168.16.10)] 2026-04-07 00:55:22.780722 | orchestrator | 2026-04-07 00:55:22.780729 | orchestrator | TASK [ceph-crash : Get keys from monitors] ************************************* 2026-04-07 00:55:22.780736 | orchestrator | Tuesday 07 April 2026 00:53:13 +0000 (0:00:04.009) 0:08:09.684 ********* 2026-04-07 00:55:22.780743 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2026-04-07 00:55:22.780750 | orchestrator | 2026-04-07 00:55:22.780757 | orchestrator | TASK [ceph-crash : Copy ceph key(s) if needed] ********************************* 2026-04-07 00:55:22.780764 | orchestrator | Tuesday 07 April 2026 00:53:15 +0000 (0:00:02.047) 0:08:11.731 ********* 2026-04-07 00:55:22.780771 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.780778 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.780785 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.780792 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.780800 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.780807 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.780814 | orchestrator | 2026-04-07 00:55:22.780821 | orchestrator | TASK [ceph-crash : Create /var/lib/ceph/crash/posted] ************************** 2026-04-07 00:55:22.780827 | orchestrator | Tuesday 07 April 2026 00:53:16 +0000 (0:00:01.357) 0:08:13.089 ********* 2026-04-07 00:55:22.780834 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.780841 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.780848 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.780855 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.780862 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.780869 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.780876 | orchestrator | 2026-04-07 00:55:22.780884 | orchestrator | TASK [ceph-crash : Include_tasks systemd.yml] ********************************** 
2026-04-07 00:55:22.780894 | orchestrator | Tuesday 07 April 2026 00:53:17 +0000 (0:00:01.159) 0:08:14.248 ********* 2026-04-07 00:55:22.780912 | orchestrator | included: /ansible/roles/ceph-crash/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.780920 | orchestrator | 2026-04-07 00:55:22.780927 | orchestrator | TASK [ceph-crash : Generate systemd unit file for ceph-crash container] ******** 2026-04-07 00:55:22.780933 | orchestrator | Tuesday 07 April 2026 00:53:19 +0000 (0:00:01.042) 0:08:15.291 ********* 2026-04-07 00:55:22.780940 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.780947 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.780953 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.780960 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.780967 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.780974 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.780981 | orchestrator | 2026-04-07 00:55:22.780988 | orchestrator | TASK [ceph-crash : Start the ceph-crash service] ******************************* 2026-04-07 00:55:22.780995 | orchestrator | Tuesday 07 April 2026 00:53:20 +0000 (0:00:01.415) 0:08:16.707 ********* 2026-04-07 00:55:22.781002 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.781009 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.781016 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.781023 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.781030 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.781037 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.781044 | orchestrator | 2026-04-07 00:55:22.781050 | orchestrator | RUNNING HANDLER [ceph-handler : Ceph crash handler] **************************** 2026-04-07 00:55:22.781057 | orchestrator | Tuesday 07 April 2026 00:53:23 +0000 (0:00:03.242) 
0:08:19.949 ********* 2026-04-07 00:55:22.781075 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_crash.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:55:22.781083 | orchestrator | 2026-04-07 00:55:22.781090 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called before restart] ****** 2026-04-07 00:55:22.781097 | orchestrator | Tuesday 07 April 2026 00:53:24 +0000 (0:00:00.990) 0:08:20.939 ********* 2026-04-07 00:55:22.781104 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781118 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781125 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781132 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.781139 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.781146 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.781153 | orchestrator | 2026-04-07 00:55:22.781160 | orchestrator | RUNNING HANDLER [ceph-handler : Restart the ceph-crash service] **************** 2026-04-07 00:55:22.781167 | orchestrator | Tuesday 07 April 2026 00:53:25 +0000 (0:00:00.510) 0:08:21.450 ********* 2026-04-07 00:55:22.781174 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.781181 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.781188 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.781195 | orchestrator | changed: [testbed-node-1] 2026-04-07 00:55:22.781202 | orchestrator | changed: [testbed-node-2] 2026-04-07 00:55:22.781209 | orchestrator | changed: [testbed-node-0] 2026-04-07 00:55:22.781215 | orchestrator | 2026-04-07 00:55:22.781222 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called after restart] ******* 2026-04-07 00:55:22.781229 | orchestrator | Tuesday 07 April 2026 00:53:28 +0000 (0:00:03.113) 0:08:24.563 ********* 2026-04-07 00:55:22.781237 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781244 | 
orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781251 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781258 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:55:22.781264 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:55:22.781271 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:55:22.781278 | orchestrator | 2026-04-07 00:55:22.781285 | orchestrator | PLAY [Apply role ceph-mds] ***************************************************** 2026-04-07 00:55:22.781292 | orchestrator | 2026-04-07 00:55:22.781299 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-07 00:55:22.781306 | orchestrator | Tuesday 07 April 2026 00:53:29 +0000 (0:00:00.707) 0:08:25.271 ********* 2026-04-07 00:55:22.781313 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.781320 | orchestrator | 2026-04-07 00:55:22.781327 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-04-07 00:55:22.781334 | orchestrator | Tuesday 07 April 2026 00:53:29 +0000 (0:00:00.575) 0:08:25.847 ********* 2026-04-07 00:55:22.781341 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.781348 | orchestrator | 2026-04-07 00:55:22.781355 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-07 00:55:22.781362 | orchestrator | Tuesday 07 April 2026 00:53:30 +0000 (0:00:00.464) 0:08:26.311 ********* 2026-04-07 00:55:22.781368 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.781375 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.781382 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.781389 | orchestrator | 2026-04-07 00:55:22.781396 | orchestrator | TASK [ceph-handler : Check for an osd 
container] ******************************* 2026-04-07 00:55:22.781404 | orchestrator | Tuesday 07 April 2026 00:53:30 +0000 (0:00:00.417) 0:08:26.729 ********* 2026-04-07 00:55:22.781411 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781417 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781424 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781431 | orchestrator | 2026-04-07 00:55:22.781438 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-04-07 00:55:22.781445 | orchestrator | Tuesday 07 April 2026 00:53:31 +0000 (0:00:00.621) 0:08:27.351 ********* 2026-04-07 00:55:22.781452 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781458 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781465 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781473 | orchestrator | 2026-04-07 00:55:22.781480 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-07 00:55:22.781487 | orchestrator | Tuesday 07 April 2026 00:53:31 +0000 (0:00:00.630) 0:08:27.981 ********* 2026-04-07 00:55:22.781499 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781510 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781517 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781523 | orchestrator | 2026-04-07 00:55:22.781530 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-07 00:55:22.781537 | orchestrator | Tuesday 07 April 2026 00:53:32 +0000 (0:00:00.604) 0:08:28.586 ********* 2026-04-07 00:55:22.781544 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.781551 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.781558 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.781566 | orchestrator | 2026-04-07 00:55:22.781573 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-04-07 
00:55:22.781579 | orchestrator | Tuesday 07 April 2026 00:53:32 +0000 (0:00:00.459) 0:08:29.046 ********* 2026-04-07 00:55:22.781586 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.781593 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.781600 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.781607 | orchestrator | 2026-04-07 00:55:22.781614 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-07 00:55:22.781620 | orchestrator | Tuesday 07 April 2026 00:53:33 +0000 (0:00:00.274) 0:08:29.321 ********* 2026-04-07 00:55:22.781627 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.781634 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.781642 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.781649 | orchestrator | 2026-04-07 00:55:22.781656 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-07 00:55:22.781663 | orchestrator | Tuesday 07 April 2026 00:53:33 +0000 (0:00:00.271) 0:08:29.592 ********* 2026-04-07 00:55:22.781670 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781676 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781683 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781690 | orchestrator | 2026-04-07 00:55:22.781697 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-07 00:55:22.781704 | orchestrator | Tuesday 07 April 2026 00:53:33 +0000 (0:00:00.661) 0:08:30.254 ********* 2026-04-07 00:55:22.781714 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781721 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781728 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781735 | orchestrator | 2026-04-07 00:55:22.781742 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-07 00:55:22.781748 | orchestrator | 
Tuesday 07 April 2026 00:53:34 +0000 (0:00:00.731) 0:08:30.986 ********* 2026-04-07 00:55:22.781755 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.781762 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.781769 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.781776 | orchestrator | 2026-04-07 00:55:22.781782 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-07 00:55:22.781789 | orchestrator | Tuesday 07 April 2026 00:53:35 +0000 (0:00:00.443) 0:08:31.429 ********* 2026-04-07 00:55:22.781797 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.781804 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.781811 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.781818 | orchestrator | 2026-04-07 00:55:22.781824 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-07 00:55:22.781831 | orchestrator | Tuesday 07 April 2026 00:53:35 +0000 (0:00:00.239) 0:08:31.669 ********* 2026-04-07 00:55:22.781839 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781846 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781852 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781859 | orchestrator | 2026-04-07 00:55:22.781866 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-07 00:55:22.781873 | orchestrator | Tuesday 07 April 2026 00:53:35 +0000 (0:00:00.267) 0:08:31.937 ********* 2026-04-07 00:55:22.781880 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781892 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781925 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781933 | orchestrator | 2026-04-07 00:55:22.781939 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-07 00:55:22.781946 | orchestrator | Tuesday 07 April 2026 00:53:35 +0000 
(0:00:00.263) 0:08:32.200 ********* 2026-04-07 00:55:22.781953 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.781960 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.781967 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.781975 | orchestrator | 2026-04-07 00:55:22.781982 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-07 00:55:22.781989 | orchestrator | Tuesday 07 April 2026 00:53:36 +0000 (0:00:00.474) 0:08:32.675 ********* 2026-04-07 00:55:22.781996 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.782003 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.782010 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.782036 | orchestrator | 2026-04-07 00:55:22.782043 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-07 00:55:22.782051 | orchestrator | Tuesday 07 April 2026 00:53:36 +0000 (0:00:00.247) 0:08:32.922 ********* 2026-04-07 00:55:22.782058 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.782065 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.782072 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.782079 | orchestrator | 2026-04-07 00:55:22.782086 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-07 00:55:22.782093 | orchestrator | Tuesday 07 April 2026 00:53:36 +0000 (0:00:00.257) 0:08:33.179 ********* 2026-04-07 00:55:22.782100 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.782106 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.782113 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.782120 | orchestrator | 2026-04-07 00:55:22.782127 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-07 00:55:22.782133 | orchestrator | Tuesday 07 April 2026 00:53:37 +0000 (0:00:00.241) 
0:08:33.421 ********* 2026-04-07 00:55:22.782140 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.782147 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.782153 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.782160 | orchestrator | 2026-04-07 00:55:22.782166 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-07 00:55:22.782173 | orchestrator | Tuesday 07 April 2026 00:53:37 +0000 (0:00:00.467) 0:08:33.888 ********* 2026-04-07 00:55:22.782180 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.782187 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.782194 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.782201 | orchestrator | 2026-04-07 00:55:22.782212 | orchestrator | TASK [ceph-mds : Include create_mds_filesystems.yml] *************************** 2026-04-07 00:55:22.782220 | orchestrator | Tuesday 07 April 2026 00:53:38 +0000 (0:00:00.471) 0:08:34.359 ********* 2026-04-07 00:55:22.782226 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.782234 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.782241 | orchestrator | included: /ansible/roles/ceph-mds/tasks/create_mds_filesystems.yml for testbed-node-3 2026-04-07 00:55:22.782248 | orchestrator | 2026-04-07 00:55:22.782255 | orchestrator | TASK [ceph-facts : Get current default crush rule details] ********************* 2026-04-07 00:55:22.782262 | orchestrator | Tuesday 07 April 2026 00:53:38 +0000 (0:00:00.480) 0:08:34.840 ********* 2026-04-07 00:55:22.782268 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2026-04-07 00:55:22.782275 | orchestrator | 2026-04-07 00:55:22.782282 | orchestrator | TASK [ceph-facts : Get current default crush rule name] ************************ 2026-04-07 00:55:22.782289 | orchestrator | Tuesday 07 April 2026 00:53:40 +0000 (0:00:02.222) 0:08:37.063 ********* 2026-04-07 00:55:22.782296 | orchestrator | skipping: [testbed-node-3] => 
(item={'rule_id': 0, 'rule_name': 'replicated_rule', 'type': 1, 'steps': [{'op': 'take', 'item': -1, 'item_name': 'default'}, {'op': 'chooseleaf_firstn', 'num': 0, 'type': 'host'}, {'op': 'emit'}]})  2026-04-07 00:55:22.782307 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.782314 | orchestrator | 2026-04-07 00:55:22.782320 | orchestrator | TASK [ceph-mds : Create filesystem pools] ************************************** 2026-04-07 00:55:22.782327 | orchestrator | Tuesday 07 April 2026 00:53:40 +0000 (0:00:00.198) 0:08:37.261 ********* 2026-04-07 00:55:22.782337 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_data', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-07 00:55:22.782348 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_metadata', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-07 00:55:22.782355 | orchestrator | 2026-04-07 00:55:22.782361 | orchestrator | TASK [ceph-mds : Create ceph filesystem] *************************************** 2026-04-07 00:55:22.782367 | orchestrator | Tuesday 07 April 2026 00:53:48 +0000 (0:00:07.038) 0:08:44.299 ********* 2026-04-07 00:55:22.782373 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2026-04-07 00:55:22.782379 | orchestrator | 2026-04-07 00:55:22.782386 | orchestrator | TASK [ceph-mds : Include common.yml] ******************************************* 2026-04-07 00:55:22.782392 | orchestrator | Tuesday 07 April 2026 00:53:52 +0000 (0:00:04.041) 0:08:48.341 ********* 2026-04-07 00:55:22.782399 | orchestrator | included: /ansible/roles/ceph-mds/tasks/common.yml for testbed-node-3, testbed-node-4, 
testbed-node-5 2026-04-07 00:55:22.782405 | orchestrator | 2026-04-07 00:55:22.782412 | orchestrator | TASK [ceph-mds : Create bootstrap-mds and mds directories] ********************* 2026-04-07 00:55:22.782418 | orchestrator | Tuesday 07 April 2026 00:53:52 +0000 (0:00:00.456) 0:08:48.798 ********* 2026-04-07 00:55:22.782424 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds/) 2026-04-07 00:55:22.782431 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds/) 2026-04-07 00:55:22.782437 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds/ceph-testbed-node-3) 2026-04-07 00:55:22.782443 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds/) 2026-04-07 00:55:22.782449 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds/ceph-testbed-node-4) 2026-04-07 00:55:22.782456 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds/ceph-testbed-node-5) 2026-04-07 00:55:22.782462 | orchestrator | 2026-04-07 00:55:22.782468 | orchestrator | TASK [ceph-mds : Get keys from monitors] *************************************** 2026-04-07 00:55:22.782474 | orchestrator | Tuesday 07 April 2026 00:53:53 +0000 (0:00:01.117) 0:08:49.915 ********* 2026-04-07 00:55:22.782481 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:55:22.782487 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-04-07 00:55:22.782494 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2026-04-07 00:55:22.782501 | orchestrator | 2026-04-07 00:55:22.782507 | orchestrator | TASK [ceph-mds : Copy ceph key(s) if needed] *********************************** 2026-04-07 00:55:22.782513 | orchestrator | Tuesday 07 April 2026 00:53:55 +0000 (0:00:01.899) 0:08:51.815 ********* 2026-04-07 00:55:22.782519 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-04-07 00:55:22.782526 | orchestrator | skipping: [testbed-node-3] 
=> (item=None)  2026-04-07 00:55:22.782532 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.782538 | orchestrator | changed: [testbed-node-4] => (item=None) 2026-04-07 00:55:22.782545 | orchestrator | skipping: [testbed-node-4] => (item=None)  2026-04-07 00:55:22.782551 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.782557 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-04-07 00:55:22.782564 | orchestrator | skipping: [testbed-node-5] => (item=None)  2026-04-07 00:55:22.782570 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.782580 | orchestrator | 2026-04-07 00:55:22.782587 | orchestrator | TASK [ceph-mds : Create mds keyring] ******************************************* 2026-04-07 00:55:22.782593 | orchestrator | Tuesday 07 April 2026 00:53:56 +0000 (0:00:01.065) 0:08:52.880 ********* 2026-04-07 00:55:22.782599 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.782605 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.782612 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.782618 | orchestrator | 2026-04-07 00:55:22.782627 | orchestrator | TASK [ceph-mds : Non_containerized.yml] **************************************** 2026-04-07 00:55:22.782634 | orchestrator | Tuesday 07 April 2026 00:53:59 +0000 (0:00:02.425) 0:08:55.305 ********* 2026-04-07 00:55:22.782640 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.782647 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.782653 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.782660 | orchestrator | 2026-04-07 00:55:22.782666 | orchestrator | TASK [ceph-mds : Containerized.yml] ******************************************** 2026-04-07 00:55:22.782672 | orchestrator | Tuesday 07 April 2026 00:53:59 +0000 (0:00:00.541) 0:08:55.847 ********* 2026-04-07 00:55:22.782679 | orchestrator | included: /ansible/roles/ceph-mds/tasks/containerized.yml for testbed-node-3, testbed-node-4, 
testbed-node-5 2026-04-07 00:55:22.782685 | orchestrator | 2026-04-07 00:55:22.782691 | orchestrator | TASK [ceph-mds : Include_tasks systemd.yml] ************************************ 2026-04-07 00:55:22.782697 | orchestrator | Tuesday 07 April 2026 00:54:00 +0000 (0:00:00.562) 0:08:56.410 ********* 2026-04-07 00:55:22.782704 | orchestrator | included: /ansible/roles/ceph-mds/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.782710 | orchestrator | 2026-04-07 00:55:22.782716 | orchestrator | TASK [ceph-mds : Generate systemd unit file] *********************************** 2026-04-07 00:55:22.782723 | orchestrator | Tuesday 07 April 2026 00:54:00 +0000 (0:00:00.784) 0:08:57.194 ********* 2026-04-07 00:55:22.782729 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.782736 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.782742 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.782749 | orchestrator | 2026-04-07 00:55:22.782755 | orchestrator | TASK [ceph-mds : Generate systemd ceph-mds target file] ************************ 2026-04-07 00:55:22.782761 | orchestrator | Tuesday 07 April 2026 00:54:02 +0000 (0:00:01.122) 0:08:58.316 ********* 2026-04-07 00:55:22.782768 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.782777 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.782783 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.782789 | orchestrator | 2026-04-07 00:55:22.782796 | orchestrator | TASK [ceph-mds : Enable ceph-mds.target] *************************************** 2026-04-07 00:55:22.782802 | orchestrator | Tuesday 07 April 2026 00:54:03 +0000 (0:00:01.033) 0:08:59.349 ********* 2026-04-07 00:55:22.782808 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.782815 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.782821 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.782828 | orchestrator | 2026-04-07 
00:55:22.782834 | orchestrator | TASK [ceph-mds : Systemd start mds container] ********************************** 2026-04-07 00:55:22.782840 | orchestrator | Tuesday 07 April 2026 00:54:04 +0000 (0:00:01.664) 0:09:01.014 ********* 2026-04-07 00:55:22.782847 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.782853 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.782859 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.782865 | orchestrator | 2026-04-07 00:55:22.782872 | orchestrator | TASK [ceph-mds : Wait for mds socket to exist] ********************************* 2026-04-07 00:55:22.782878 | orchestrator | Tuesday 07 April 2026 00:54:06 +0000 (0:00:01.844) 0:09:02.859 ********* 2026-04-07 00:55:22.782884 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.782890 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.782906 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.782913 | orchestrator | 2026-04-07 00:55:22.782920 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-04-07 00:55:22.782926 | orchestrator | Tuesday 07 April 2026 00:54:07 +0000 (0:00:01.174) 0:09:04.034 ********* 2026-04-07 00:55:22.782936 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.782942 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.782948 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.782955 | orchestrator | 2026-04-07 00:55:22.782962 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2026-04-07 00:55:22.782968 | orchestrator | Tuesday 07 April 2026 00:54:08 +0000 (0:00:00.810) 0:09:04.844 ********* 2026-04-07 00:55:22.782974 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.782980 | orchestrator | 2026-04-07 00:55:22.782987 | orchestrator | RUNNING HANDLER [ceph-handler : Set 
_mds_handler_called before restart] ******** 2026-04-07 00:55:22.782993 | orchestrator | Tuesday 07 April 2026 00:54:09 +0000 (0:00:00.438) 0:09:05.283 ********* 2026-04-07 00:55:22.783000 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783006 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783013 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783019 | orchestrator | 2026-04-07 00:55:22.783025 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mds restart script] *********************** 2026-04-07 00:55:22.783031 | orchestrator | Tuesday 07 April 2026 00:54:09 +0000 (0:00:00.271) 0:09:05.554 ********* 2026-04-07 00:55:22.783037 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.783043 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.783050 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.783056 | orchestrator | 2026-04-07 00:55:22.783062 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2026-04-07 00:55:22.783069 | orchestrator | Tuesday 07 April 2026 00:54:10 +0000 (0:00:01.336) 0:09:06.890 ********* 2026-04-07 00:55:22.783075 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:55:22.783082 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:55:22.783088 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:55:22.783094 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783100 | orchestrator | 2026-04-07 00:55:22.783107 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2026-04-07 00:55:22.783113 | orchestrator | Tuesday 07 April 2026 00:54:11 +0000 (0:00:00.541) 0:09:07.432 ********* 2026-04-07 00:55:22.783119 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783125 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783132 | orchestrator | ok: [testbed-node-5] 2026-04-07 
00:55:22.783138 | orchestrator | 2026-04-07 00:55:22.783144 | orchestrator | PLAY [Apply role ceph-rgw] ***************************************************** 2026-04-07 00:55:22.783150 | orchestrator | 2026-04-07 00:55:22.783157 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-07 00:55:22.783167 | orchestrator | Tuesday 07 April 2026 00:54:11 +0000 (0:00:00.482) 0:09:07.915 ********* 2026-04-07 00:55:22.783174 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.783180 | orchestrator | 2026-04-07 00:55:22.783187 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-04-07 00:55:22.783193 | orchestrator | Tuesday 07 April 2026 00:54:12 +0000 (0:00:00.711) 0:09:08.627 ********* 2026-04-07 00:55:22.783199 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.783206 | orchestrator | 2026-04-07 00:55:22.783212 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-07 00:55:22.783218 | orchestrator | Tuesday 07 April 2026 00:54:12 +0000 (0:00:00.498) 0:09:09.125 ********* 2026-04-07 00:55:22.783225 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783231 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783237 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.783244 | orchestrator | 2026-04-07 00:55:22.783250 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-04-07 00:55:22.783261 | orchestrator | Tuesday 07 April 2026 00:54:13 +0000 (0:00:00.306) 0:09:09.432 ********* 2026-04-07 00:55:22.783267 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783273 | orchestrator | ok: [testbed-node-4] 2026-04-07 
00:55:22.783280 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783286 | orchestrator | 2026-04-07 00:55:22.783292 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-04-07 00:55:22.783298 | orchestrator | Tuesday 07 April 2026 00:54:14 +0000 (0:00:00.986) 0:09:10.419 ********* 2026-04-07 00:55:22.783305 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783311 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783317 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783323 | orchestrator | 2026-04-07 00:55:22.783332 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-07 00:55:22.783338 | orchestrator | Tuesday 07 April 2026 00:54:14 +0000 (0:00:00.729) 0:09:11.149 ********* 2026-04-07 00:55:22.783345 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783351 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783358 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783364 | orchestrator | 2026-04-07 00:55:22.783370 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-07 00:55:22.783376 | orchestrator | Tuesday 07 April 2026 00:54:15 +0000 (0:00:00.710) 0:09:11.860 ********* 2026-04-07 00:55:22.783383 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783389 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783395 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.783401 | orchestrator | 2026-04-07 00:55:22.783408 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-04-07 00:55:22.783414 | orchestrator | Tuesday 07 April 2026 00:54:15 +0000 (0:00:00.296) 0:09:12.156 ********* 2026-04-07 00:55:22.783420 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783426 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783432 | orchestrator | skipping: 
[testbed-node-5] 2026-04-07 00:55:22.783439 | orchestrator | 2026-04-07 00:55:22.783445 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-07 00:55:22.783451 | orchestrator | Tuesday 07 April 2026 00:54:16 +0000 (0:00:00.531) 0:09:12.687 ********* 2026-04-07 00:55:22.783457 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783463 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783470 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.783476 | orchestrator | 2026-04-07 00:55:22.783482 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-07 00:55:22.783488 | orchestrator | Tuesday 07 April 2026 00:54:16 +0000 (0:00:00.299) 0:09:12.987 ********* 2026-04-07 00:55:22.783495 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783501 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783508 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783514 | orchestrator | 2026-04-07 00:55:22.783520 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-07 00:55:22.783526 | orchestrator | Tuesday 07 April 2026 00:54:17 +0000 (0:00:00.742) 0:09:13.729 ********* 2026-04-07 00:55:22.783533 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783539 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783545 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783551 | orchestrator | 2026-04-07 00:55:22.783557 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-07 00:55:22.783564 | orchestrator | Tuesday 07 April 2026 00:54:18 +0000 (0:00:00.748) 0:09:14.478 ********* 2026-04-07 00:55:22.783570 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783576 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783582 | orchestrator | skipping: [testbed-node-5] 2026-04-07 
00:55:22.783588 | orchestrator | 2026-04-07 00:55:22.783594 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-07 00:55:22.783601 | orchestrator | Tuesday 07 April 2026 00:54:18 +0000 (0:00:00.515) 0:09:14.993 ********* 2026-04-07 00:55:22.783611 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783617 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783623 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.783629 | orchestrator | 2026-04-07 00:55:22.783635 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-07 00:55:22.783641 | orchestrator | Tuesday 07 April 2026 00:54:19 +0000 (0:00:00.303) 0:09:15.297 ********* 2026-04-07 00:55:22.783648 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783654 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783660 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783666 | orchestrator | 2026-04-07 00:55:22.783673 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-07 00:55:22.783679 | orchestrator | Tuesday 07 April 2026 00:54:19 +0000 (0:00:00.325) 0:09:15.622 ********* 2026-04-07 00:55:22.783684 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783688 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783693 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783699 | orchestrator | 2026-04-07 00:55:22.783706 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-07 00:55:22.783715 | orchestrator | Tuesday 07 April 2026 00:54:19 +0000 (0:00:00.321) 0:09:15.943 ********* 2026-04-07 00:55:22.783722 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783728 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783735 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783741 | orchestrator | 2026-04-07 
00:55:22.783748 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-07 00:55:22.783752 | orchestrator | Tuesday 07 April 2026 00:54:20 +0000 (0:00:00.564) 0:09:16.508 ********* 2026-04-07 00:55:22.783756 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783760 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783764 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.783768 | orchestrator | 2026-04-07 00:55:22.783771 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-07 00:55:22.783775 | orchestrator | Tuesday 07 April 2026 00:54:20 +0000 (0:00:00.317) 0:09:16.826 ********* 2026-04-07 00:55:22.783779 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783783 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783787 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.783790 | orchestrator | 2026-04-07 00:55:22.783794 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-07 00:55:22.783798 | orchestrator | Tuesday 07 April 2026 00:54:20 +0000 (0:00:00.302) 0:09:17.128 ********* 2026-04-07 00:55:22.783802 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783806 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783809 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.783813 | orchestrator | 2026-04-07 00:55:22.783817 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-07 00:55:22.783821 | orchestrator | Tuesday 07 April 2026 00:54:21 +0000 (0:00:00.296) 0:09:17.424 ********* 2026-04-07 00:55:22.783825 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783829 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783832 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783836 | orchestrator | 2026-04-07 00:55:22.783843 | 
orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-07 00:55:22.783847 | orchestrator | Tuesday 07 April 2026 00:54:21 +0000 (0:00:00.553) 0:09:17.978 ********* 2026-04-07 00:55:22.783851 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:55:22.783854 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:55:22.783858 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:55:22.783862 | orchestrator | 2026-04-07 00:55:22.783866 | orchestrator | TASK [ceph-rgw : Include common.yml] ******************************************* 2026-04-07 00:55:22.783870 | orchestrator | Tuesday 07 April 2026 00:54:22 +0000 (0:00:00.557) 0:09:18.535 ********* 2026-04-07 00:55:22.783874 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.783880 | orchestrator | 2026-04-07 00:55:22.783884 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2026-04-07 00:55:22.783888 | orchestrator | Tuesday 07 April 2026 00:54:23 +0000 (0:00:00.747) 0:09:19.282 ********* 2026-04-07 00:55:22.783891 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:55:22.783895 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-04-07 00:55:22.783909 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2026-04-07 00:55:22.783914 | orchestrator | 2026-04-07 00:55:22.783918 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2026-04-07 00:55:22.783922 | orchestrator | Tuesday 07 April 2026 00:54:25 +0000 (0:00:02.242) 0:09:21.525 ********* 2026-04-07 00:55:22.783926 | orchestrator | changed: [testbed-node-4] => (item=None) 2026-04-07 00:55:22.783929 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-04-07 00:55:22.783933 | orchestrator | skipping: [testbed-node-4] => (item=None)  2026-04-07 00:55:22.783937 
| orchestrator | skipping: [testbed-node-3] => (item=None)  2026-04-07 00:55:22.783941 | orchestrator | changed: [testbed-node-4] 2026-04-07 00:55:22.783945 | orchestrator | changed: [testbed-node-3] 2026-04-07 00:55:22.783948 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-04-07 00:55:22.783952 | orchestrator | skipping: [testbed-node-5] => (item=None)  2026-04-07 00:55:22.783956 | orchestrator | changed: [testbed-node-5] 2026-04-07 00:55:22.783960 | orchestrator | 2026-04-07 00:55:22.783963 | orchestrator | TASK [ceph-rgw : Copy SSL certificate & key data to certificate path] ********** 2026-04-07 00:55:22.783967 | orchestrator | Tuesday 07 April 2026 00:54:26 +0000 (0:00:01.230) 0:09:22.756 ********* 2026-04-07 00:55:22.783971 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:55:22.783975 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:55:22.783979 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:55:22.783982 | orchestrator | 2026-04-07 00:55:22.783986 | orchestrator | TASK [ceph-rgw : Include_tasks pre_requisite.yml] ****************************** 2026-04-07 00:55:22.783990 | orchestrator | Tuesday 07 April 2026 00:54:26 +0000 (0:00:00.297) 0:09:23.053 ********* 2026-04-07 00:55:22.783994 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/pre_requisite.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:55:22.783998 | orchestrator | 2026-04-07 00:55:22.784001 | orchestrator | TASK [ceph-rgw : Create rados gateway directories] ***************************** 2026-04-07 00:55:22.784005 | orchestrator | Tuesday 07 April 2026 00:54:27 +0000 (0:00:00.681) 0:09:23.735 ********* 2026-04-07 00:55:22.784009 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2026-04-07 00:55:22.784013 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => 
(item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2026-04-07 00:55:22.784017 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2026-04-07 00:55:22.784021 | orchestrator |
2026-04-07 00:55:22.784024 | orchestrator | TASK [ceph-rgw : Create rgw keyrings] ******************************************
2026-04-07 00:55:22.784031 | orchestrator | Tuesday 07 April 2026 00:54:28 +0000 (0:00:00.858) 0:09:24.594 *********
2026-04-07 00:55:22.784035 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-07 00:55:22.784039 | orchestrator | changed: [testbed-node-3 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}]
2026-04-07 00:55:22.784042 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-07 00:55:22.784046 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}]
2026-04-07 00:55:22.784050 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-07 00:55:22.784056 | orchestrator | changed: [testbed-node-4 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}]
2026-04-07 00:55:22.784060 | orchestrator |
2026-04-07 00:55:22.784064 | orchestrator | TASK [ceph-rgw : Get keys from monitors] ***************************************
2026-04-07 00:55:22.784068 | orchestrator | Tuesday 07 April 2026 00:54:32 +0000 (0:00:04.374) 0:09:28.968 *********
2026-04-07 00:55:22.784072 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-07 00:55:22.784076 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}]
2026-04-07 00:55:22.784079 | orchestrator | ok: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-07 00:55:22.784083 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-07 00:55:22.784089 | orchestrator | ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}]
2026-04-07 00:55:22.784096 | orchestrator | ok: [testbed-node-4 -> {{ groups.get(mon_group_name)[0] }}]
2026-04-07 00:55:22.784102 | orchestrator |
2026-04-07 00:55:22.784108 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] ***********************************
2026-04-07 00:55:22.784115 | orchestrator | Tuesday 07 April 2026 00:54:35 +0000 (0:00:02.334) 0:09:31.303 *********
2026-04-07 00:55:22.784121 | orchestrator | changed: [testbed-node-3] => (item=None)
2026-04-07 00:55:22.784128 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.784135 | orchestrator | changed: [testbed-node-4] => (item=None)
2026-04-07 00:55:22.784141 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.784147 | orchestrator | changed: [testbed-node-5] => (item=None)
2026-04-07 00:55:22.784153 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.784157 | orchestrator |
2026-04-07 00:55:22.784161 | orchestrator | TASK [ceph-rgw : Rgw pool creation tasks] **************************************
2026-04-07 00:55:22.784165 | orchestrator | Tuesday 07 April 2026 00:54:36 +0000 (0:00:01.561) 0:09:32.864 *********
2026-04-07 00:55:22.784169 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/rgw_create_pools.yml for testbed-node-3
2026-04-07 00:55:22.784173 | orchestrator |
2026-04-07 00:55:22.784177 | orchestrator | TASK [ceph-rgw : Create ec profile] ********************************************
2026-04-07 00:55:22.784181 | orchestrator | Tuesday 07 April 2026 00:54:36 +0000 (0:00:00.238) 0:09:33.103 *********
2026-04-07 00:55:22.784184 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784189 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784193 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784196 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784200 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784204 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.784208 | orchestrator |
2026-04-07 00:55:22.784212 | orchestrator | TASK [ceph-rgw : Set crush rule] ***********************************************
2026-04-07 00:55:22.784216 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:00.539) 0:09:33.642 *********
2026-04-07 00:55:22.784220 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784224 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784227 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784231 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784239 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784244 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.784247 | orchestrator |
2026-04-07 00:55:22.784251 | orchestrator | TASK [ceph-rgw : Create rgw pools] *********************************************
2026-04-07 00:55:22.784255 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:00.552) 0:09:34.195 *********
2026-04-07 00:55:22.784262 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784282 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784286 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784290 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784294 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})
2026-04-07 00:55:22.784298 | orchestrator |
2026-04-07 00:55:22.784301 | orchestrator | TASK [ceph-rgw : Include_tasks openstack-keystone.yml] *************************
2026-04-07 00:55:22.784305 | orchestrator | Tuesday 07 April 2026 00:55:08 +0000 (0:00:30.782) 0:10:04.978 *********
2026-04-07 00:55:22.784309 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.784313 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.784317 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.784321 | orchestrator |
2026-04-07 00:55:22.784325 | orchestrator | TASK [ceph-rgw : Include_tasks start_radosgw.yml] ******************************
2026-04-07 00:55:22.784329 | orchestrator |
Tuesday 07 April 2026 00:55:08 +0000 (0:00:00.515) 0:10:05.266 *********
2026-04-07 00:55:22.784332 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.784336 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.784340 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.784344 | orchestrator |
2026-04-07 00:55:22.784350 | orchestrator | TASK [ceph-rgw : Include start_docker_rgw.yml] *********************************
2026-04-07 00:55:22.784354 | orchestrator | Tuesday 07 April 2026 00:55:09 +0000 (0:00:00.511) 0:10:05.781 *********
2026-04-07 00:55:22.784358 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/start_docker_rgw.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.784362 | orchestrator |
2026-04-07 00:55:22.784365 | orchestrator | TASK [ceph-rgw : Include_task systemd.yml] *************************************
2026-04-07 00:55:22.784369 | orchestrator | Tuesday 07 April 2026 00:55:10 +0000 (0:00:00.511) 0:10:06.293 *********
2026-04-07 00:55:22.784373 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.784377 | orchestrator |
2026-04-07 00:55:22.784381 | orchestrator | TASK [ceph-rgw : Generate systemd unit file] ***********************************
2026-04-07 00:55:22.784385 | orchestrator | Tuesday 07 April 2026 00:55:10 +0000 (0:00:00.685) 0:10:06.978 *********
2026-04-07 00:55:22.784388 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.784392 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.784396 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.784400 | orchestrator |
2026-04-07 00:55:22.784404 | orchestrator | TASK [ceph-rgw : Generate systemd ceph-radosgw target file] ********************
2026-04-07 00:55:22.784408 | orchestrator | Tuesday 07 April 2026 00:55:12 +0000 (0:00:01.294) 0:10:08.273 *********
2026-04-07 00:55:22.784412 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.784415 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.784421 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.784425 | orchestrator |
2026-04-07 00:55:22.784429 | orchestrator | TASK [ceph-rgw : Enable ceph-radosgw.target] ***********************************
2026-04-07 00:55:22.784433 | orchestrator | Tuesday 07 April 2026 00:55:13 +0000 (0:00:01.178) 0:10:09.452 *********
2026-04-07 00:55:22.784437 | orchestrator | changed: [testbed-node-4]
2026-04-07 00:55:22.784443 | orchestrator | changed: [testbed-node-3]
2026-04-07 00:55:22.784451 | orchestrator | changed: [testbed-node-5]
2026-04-07 00:55:22.784461 | orchestrator |
2026-04-07 00:55:22.784467 | orchestrator | TASK [ceph-rgw : Systemd start rgw container] **********************************
2026-04-07 00:55:22.784472 | orchestrator | Tuesday 07 April 2026 00:55:15 +0000 (0:00:02.822) 0:10:11.422 *********
2026-04-07 00:55:22.784478 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2026-04-07 00:55:22.784484 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2026-04-07 00:55:22.784490 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2026-04-07 00:55:22.784495 | orchestrator |
2026-04-07 00:55:22.784501 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] **********************
2026-04-07 00:55:22.784507 | orchestrator | Tuesday 07 April 2026 00:55:17 +0000 (0:00:02.822) 0:10:14.244 *********
2026-04-07 00:55:22.784513 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.784520 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.784526 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.784532 | orchestrator |
2026-04-07 00:55:22.784539 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] **********************************
2026-04-07 00:55:22.784543 | orchestrator | Tuesday 07 April 2026 00:55:18 +0000 (0:00:00.324) 0:10:14.569 *********
2026-04-07 00:55:22.784547 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:55:22.784551 | orchestrator |
2026-04-07 00:55:22.784555 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ********
2026-04-07 00:55:22.784559 | orchestrator | Tuesday 07 April 2026 00:55:19 +0000 (0:00:00.748) 0:10:15.317 *********
2026-04-07 00:55:22.784562 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.784566 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.784570 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.784574 | orchestrator |
2026-04-07 00:55:22.784582 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] ***********************
2026-04-07 00:55:22.784586 | orchestrator | Tuesday 07 April 2026 00:55:19 +0000 (0:00:00.325) 0:10:15.642 *********
2026-04-07 00:55:22.784590 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.784594 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:55:22.784597 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:55:22.784601 | orchestrator |
2026-04-07 00:55:22.784605 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ********************
2026-04-07 00:55:22.784609 | orchestrator | Tuesday 07 April 2026 00:55:19 +0000 (0:00:00.327) 0:10:15.969 *********
2026-04-07 00:55:22.784613 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-07 00:55:22.784617 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-07 00:55:22.784620 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-07 00:55:22.784624 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:55:22.784628 | orchestrator |
2026-04-07 00:55:22.784632 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] *********
2026-04-07 00:55:22.784636 | orchestrator | Tuesday 07 April 2026 00:55:20 +0000 (0:00:00.829) 0:10:16.799 *********
2026-04-07 00:55:22.784640 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:55:22.784643 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:55:22.784647 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:55:22.784651 | orchestrator |
2026-04-07 00:55:22.784658 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:55:22.784662 | orchestrator | testbed-node-0 : ok=134  changed=35  unreachable=0 failed=0 skipped=125  rescued=0 ignored=0
2026-04-07 00:55:22.784666 | orchestrator | testbed-node-1 : ok=127  changed=31  unreachable=0 failed=0 skipped=120  rescued=0 ignored=0
2026-04-07 00:55:22.784673 | orchestrator | testbed-node-2 : ok=134  changed=33  unreachable=0 failed=0 skipped=119  rescued=0 ignored=0
2026-04-07 00:55:22.784677 | orchestrator | testbed-node-3 : ok=193  changed=45  unreachable=0 failed=0 skipped=162  rescued=0 ignored=0
2026-04-07 00:55:22.784681 | orchestrator | testbed-node-4 : ok=175  changed=40  unreachable=0 failed=0 skipped=123  rescued=0 ignored=0
2026-04-07 00:55:22.784685 | orchestrator | testbed-node-5 : ok=177  changed=41  unreachable=0 failed=0 skipped=121  rescued=0 ignored=0
2026-04-07 00:55:22.784688 | orchestrator |
2026-04-07 00:55:22.784692 | orchestrator |
2026-04-07 00:55:22.784696 | orchestrator |
2026-04-07 00:55:22.784700 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:55:22.784704 | orchestrator | Tuesday 07 April 2026 00:55:21 +0000 (0:00:00.474) 0:10:17.273 *********
2026-04-07 00:55:22.784708 | orchestrator | ===============================================================================
2026-04-07 00:55:22.784711 | orchestrator | ceph-osd : Use ceph-volume to create osds ------------------------------ 43.42s
2026-04-07 00:55:22.784715 | orchestrator | ceph-container-common : Pulling Ceph container image ------------------- 42.98s
2026-04-07 00:55:22.784719 | orchestrator | ceph-mgr : Wait for all mgr to be up ----------------------------------- 36.69s
2026-04-07 00:55:22.784723 | orchestrator | ceph-rgw : Create rgw pools -------------------------------------------- 30.78s
2026-04-07 00:55:22.784727 | orchestrator | ceph-mon : Set cluster configs ----------------------------------------- 15.88s
2026-04-07 00:55:22.784730 | orchestrator | ceph-osd : Wait for all osd to be up ----------------------------------- 12.75s
2026-04-07 00:55:22.784734 | orchestrator | ceph-mgr : Create ceph mgr keyring(s) on a mon node -------------------- 11.12s
2026-04-07 00:55:22.784738 | orchestrator | ceph-mon : Fetch ceph initial keys -------------------------------------- 9.71s
2026-04-07 00:55:22.784742 | orchestrator | ceph-mds : Create filesystem pools -------------------------------------- 7.04s
2026-04-07 00:55:22.784746 | orchestrator | ceph-config : Create ceph initial directories --------------------------- 6.96s
2026-04-07 00:55:22.784749 | orchestrator | ceph-mgr : Disable ceph mgr enabled modules ----------------------------- 6.33s
2026-04-07 00:55:22.784753 | orchestrator | ceph-mgr : Add modules to ceph-mgr -------------------------------------- 4.64s
2026-04-07 00:55:22.784757 | orchestrator | ceph-facts : Find a running mon container ------------------------------- 4.56s
2026-04-07 00:55:22.784761 | orchestrator | ceph-rgw : Create rgw keyrings ------------------------------------------ 4.37s
2026-04-07 00:55:22.784764 | orchestrator | ceph-mds : Create ceph filesystem --------------------------------------- 4.04s
2026-04-07 00:55:22.784768 | orchestrator | ceph-crash : Create client.crash keyring -------------------------------- 4.01s
2026-04-07 00:55:22.784772 | orchestrator | ceph-mon : Copy admin keyring over to mons ------------------------------ 3.74s
2026-04-07 00:55:22.784776 | orchestrator | ceph-container-common : Get ceph version -------------------------------- 3.61s
2026-04-07 00:55:22.784779 | orchestrator | ceph-osd : Systemd start osd -------------------------------------------- 3.59s
2026-04-07 00:55:22.784783 | orchestrator | ceph-crash : Start the ceph-crash service ------------------------------- 3.24s
2026-04-07 00:55:25.801998 | orchestrator | 2026-04-07 00:55:25 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:25.803647 | orchestrator | 2026-04-07 00:55:25 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:25.805320 | orchestrator | 2026-04-07 00:55:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:25.807748 | orchestrator | 2026-04-07 00:55:25 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:25.807797 | orchestrator | 2026-04-07 00:55:25 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:28.852443 | orchestrator | 2026-04-07 00:55:28 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:28.854204 | orchestrator | 2026-04-07 00:55:28 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:28.855987 | orchestrator | 2026-04-07 00:55:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:28.857829 | orchestrator | 2026-04-07 00:55:28 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:28.857868 | orchestrator | 2026-04-07 00:55:28 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:31.909175 | orchestrator | 2026-04-07 00:55:31 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:31.910521 | orchestrator | 2026-04-07 00:55:31 |
INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state STARTED
2026-04-07 00:55:31.912160 | orchestrator | 2026-04-07 00:55:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:31.913434 | orchestrator | 2026-04-07 00:55:31 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state STARTED
2026-04-07 00:55:31.913494 | orchestrator | 2026-04-07 00:55:31 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:34.959027 | orchestrator | 2026-04-07 00:55:34 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:34.960029 | orchestrator | 2026-04-07 00:55:34 | INFO  | Task d86b1f0e-5e7f-4981-aace-0eeea48abaaa is in state SUCCESS
2026-04-07 00:55:34.962457 | orchestrator | 2026-04-07 00:55:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:34.963244 | orchestrator | 2026-04-07 00:55:34 | INFO  | Task 30a1b2d4-8014-44b1-af22-fdb68038183a is in state SUCCESS
2026-04-07 00:55:34.963325 | orchestrator | 2026-04-07 00:55:34 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:38.010700 | orchestrator | 2026-04-07 00:55:38 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:38.011636 | orchestrator | 2026-04-07 00:55:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:38.011706 | orchestrator | 2026-04-07 00:55:38 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:41.053237 | orchestrator | 2026-04-07 00:55:41 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:41.053908 | orchestrator | 2026-04-07 00:55:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:41.053996 | orchestrator | 2026-04-07 00:55:41 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:44.102783 | orchestrator | 2026-04-07 00:55:44 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:44.105053 | orchestrator | 2026-04-07 00:55:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:44.105111 | orchestrator | 2026-04-07 00:55:44 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:47.144715 | orchestrator | 2026-04-07 00:55:47 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:47.145090 | orchestrator | 2026-04-07 00:55:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:47.145162 | orchestrator | 2026-04-07 00:55:47 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:50.198300 | orchestrator | 2026-04-07 00:55:50 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:50.200757 | orchestrator | 2026-04-07 00:55:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:50.200811 | orchestrator | 2026-04-07 00:55:50 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:53.252509 | orchestrator | 2026-04-07 00:55:53 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:53.254131 | orchestrator | 2026-04-07 00:55:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:53.254371 | orchestrator | 2026-04-07 00:55:53 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:56.302802 | orchestrator | 2026-04-07 00:55:56 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:56.305245 | orchestrator | 2026-04-07 00:55:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:56.305314 | orchestrator | 2026-04-07 00:55:56 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:55:59.343318 | orchestrator | 2026-04-07 00:55:59 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:55:59.345246 | orchestrator | 2026-04-07 00:55:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:55:59.345294 | orchestrator | 2026-04-07 00:55:59 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:02.379155 | orchestrator | 2026-04-07 00:56:02 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:02.380768 | orchestrator | 2026-04-07 00:56:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:02.380843 | orchestrator | 2026-04-07 00:56:02 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:05.424523 | orchestrator | 2026-04-07 00:56:05 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:05.426419 | orchestrator | 2026-04-07 00:56:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:05.426565 | orchestrator | 2026-04-07 00:56:05 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:08.466945 | orchestrator | 2026-04-07 00:56:08 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:08.469321 | orchestrator | 2026-04-07 00:56:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:08.469384 | orchestrator | 2026-04-07 00:56:08 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:11.515539 | orchestrator | 2026-04-07 00:56:11 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:11.518933 | orchestrator | 2026-04-07 00:56:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:11.519039 | orchestrator | 2026-04-07 00:56:11 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:14.561146 | orchestrator | 2026-04-07 00:56:14 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:14.563561 | orchestrator | 2026-04-07 00:56:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:14.563866 | orchestrator | 2026-04-07 00:56:14 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:17.603930 | orchestrator | 2026-04-07 00:56:17 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:17.605429 | orchestrator | 2026-04-07 00:56:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:17.605503 | orchestrator | 2026-04-07 00:56:17 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:20.651184 | orchestrator | 2026-04-07 00:56:20 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:20.653028 | orchestrator | 2026-04-07 00:56:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:20.653155 | orchestrator | 2026-04-07 00:56:20 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:23.700177 | orchestrator | 2026-04-07 00:56:23 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:23.702093 | orchestrator | 2026-04-07 00:56:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:23.702170 | orchestrator | 2026-04-07 00:56:23 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:26.744331 | orchestrator | 2026-04-07 00:56:26 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:26.746415 | orchestrator | 2026-04-07 00:56:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:26.746480 | orchestrator | 2026-04-07 00:56:26 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:29.785957 | orchestrator | 2026-04-07 00:56:29 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:29.787941 | orchestrator | 2026-04-07 00:56:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:29.788050 | orchestrator | 2026-04-07 00:56:29 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:32.829060 | orchestrator | 2026-04-07 00:56:32 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:32.830353 | orchestrator | 2026-04-07 00:56:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:32.830428 | orchestrator | 2026-04-07 00:56:32 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:35.870994 | orchestrator | 2026-04-07 00:56:35 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:35.871647 | orchestrator | 2026-04-07 00:56:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:35.871730 | orchestrator | 2026-04-07 00:56:35 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:38.911524 | orchestrator | 2026-04-07 00:56:38 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:38.912680 | orchestrator | 2026-04-07 00:56:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:38.912722 | orchestrator | 2026-04-07 00:56:38 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:41.961699 | orchestrator | 2026-04-07 00:56:41 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:41.963617 | orchestrator | 2026-04-07 00:56:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:41.963653 | orchestrator | 2026-04-07 00:56:41 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:45.012318 | orchestrator | 2026-04-07 00:56:45 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:45.014505 | orchestrator | 2026-04-07 00:56:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:45.014570 | orchestrator | 2026-04-07 00:56:45 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:48.060433 | orchestrator | 2026-04-07 00:56:48 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:48.062958 | orchestrator | 2026-04-07 00:56:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:48.063046 | orchestrator | 2026-04-07 00:56:48 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:51.113530 | orchestrator | 2026-04-07 00:56:51 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:51.116289 | orchestrator | 2026-04-07 00:56:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:51.116365 | orchestrator | 2026-04-07 00:56:51 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:54.161646 | orchestrator | 2026-04-07 00:56:54 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:54.163490 | orchestrator | 2026-04-07 00:56:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:54.163532 | orchestrator | 2026-04-07 00:56:54 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:56:57.209342 | orchestrator | 2026-04-07 00:56:57 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:56:57.211800 | orchestrator | 2026-04-07 00:56:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:56:57.211867 | orchestrator | 2026-04-07 00:56:57 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:00.254854 | orchestrator | 2026-04-07 00:57:00 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:00.255865 | orchestrator | 2026-04-07 00:57:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:00.255917 | orchestrator | 2026-04-07 00:57:00 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:03.299914 | orchestrator | 2026-04-07 00:57:03 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:03.301690 | orchestrator | 2026-04-07 00:57:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:03.301758 | orchestrator | 2026-04-07 00:57:03 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:06.348235 | orchestrator | 2026-04-07 00:57:06 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:06.349840 | orchestrator | 2026-04-07 00:57:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:06.349897 | orchestrator | 2026-04-07 00:57:06 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:09.395905 | orchestrator | 2026-04-07 00:57:09 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:09.397027 | orchestrator | 2026-04-07 00:57:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:09.397059 | orchestrator | 2026-04-07 00:57:09 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:12.443258 | orchestrator | 2026-04-07 00:57:12 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:12.445026 | orchestrator | 2026-04-07 00:57:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:12.445596 | orchestrator | 2026-04-07 00:57:12 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:15.489905 | orchestrator | 2026-04-07 00:57:15 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:15.490467 | orchestrator | 2026-04-07 00:57:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:15.490528 | orchestrator | 2026-04-07 00:57:15 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:18.529388 | orchestrator | 2026-04-07 00:57:18 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:18.531863 | orchestrator | 2026-04-07 00:57:18 |
INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:18.531986 | orchestrator | 2026-04-07 00:57:18 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:21.578984 | orchestrator | 2026-04-07 00:57:21 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:21.581613 | orchestrator | 2026-04-07 00:57:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:21.581679 | orchestrator | 2026-04-07 00:57:21 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:24.625181 | orchestrator | 2026-04-07 00:57:24 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:24.627592 | orchestrator | 2026-04-07 00:57:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:24.627671 | orchestrator | 2026-04-07 00:57:24 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:27.675763 | orchestrator | 2026-04-07 00:57:27 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:27.676654 | orchestrator | 2026-04-07 00:57:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:27.676826 | orchestrator | 2026-04-07 00:57:27 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:30.715553 | orchestrator | 2026-04-07 00:57:30 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state STARTED
2026-04-07 00:57:30.716996 | orchestrator | 2026-04-07 00:57:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 00:57:30.717342 | orchestrator | 2026-04-07 00:57:30 | INFO  | Wait 1 second(s) until the next check
2026-04-07 00:57:33.765937 | orchestrator | 2026-04-07 00:57:33 | INFO  | Task de76e72d-bcfe-476f-8ceb-94ece36e2cb4 is in state SUCCESS
2026-04-07 00:57:33.766998 | orchestrator |
2026-04-07 00:57:33.767039 | orchestrator |
2026-04-07 00:57:33.767049 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 00:57:33.767059 | orchestrator |
2026-04-07 00:57:33.767068 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 00:57:33.767077 | orchestrator | Tuesday 07 April 2026 00:54:36 +0000 (0:00:00.308) 0:00:00.308 *********
2026-04-07 00:57:33.767086 | orchestrator | ok: [testbed-node-0]
2026-04-07 00:57:33.767095 | orchestrator | ok: [testbed-node-1]
2026-04-07 00:57:33.767194 | orchestrator | ok: [testbed-node-2]
2026-04-07 00:57:33.767203 | orchestrator |
2026-04-07 00:57:33.767333 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 00:57:33.767344 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:00.386) 0:00:00.695 *********
2026-04-07 00:57:33.767353 | orchestrator | ok: [testbed-node-0] => (item=enable_placement_True)
2026-04-07 00:57:33.767362 | orchestrator | ok: [testbed-node-1] => (item=enable_placement_True)
2026-04-07 00:57:33.767371 | orchestrator | ok: [testbed-node-2] => (item=enable_placement_True)
2026-04-07 00:57:33.767379 | orchestrator |
2026-04-07 00:57:33.767388 | orchestrator | PLAY [Apply role placement] ****************************************************
2026-04-07 00:57:33.767396 | orchestrator |
2026-04-07 00:57:33.767404 | orchestrator | TASK [placement : include_tasks] ***********************************************
2026-04-07 00:57:33.767412 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:00.376) 0:00:01.071 *********
2026-04-07 00:57:33.767421 | orchestrator | included: /ansible/roles/placement/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-07 00:57:33.767991 | orchestrator |
2026-04-07 00:57:33.768026 | orchestrator | TASK [service-ks-register : placement | Creating/deleting services] ************
2026-04-07 00:57:33.768057 | orchestrator | Tuesday 07 April 2026 00:54:38 +0000 (0:00:00.779) 0:00:01.851 *********
2026-04-07 00:57:33.768066 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (5 retries left).
2026-04-07 00:57:33.768075 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (4 retries left).
2026-04-07 00:57:33.768083 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (3 retries left).
2026-04-07 00:57:33.768091 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (2 retries left).
2026-04-07 00:57:33.768127 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (1 retries left).
2026-04-07 00:57:33.768139 | orchestrator | failed: [testbed-node-0] (item=placement (placement)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Placement Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:8780"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:8780"}], "name": "placement", "type": "placement"}, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-07 00:57:33.768150 | orchestrator |
2026-04-07 00:57:33.768206 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:57:33.768217 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2026-04-07 00:57:33.768227 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:57:33.768236 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:57:33.768244 | orchestrator |
2026-04-07 00:57:33.768253 | orchestrator |
2026-04-07 00:57:33.768261 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:57:33.768282 | orchestrator
| Tuesday 07 April 2026 00:55:32 +0000 (0:00:53.801) 0:00:55.652 ********* 2026-04-07 00:57:33.768291 | orchestrator | =============================================================================== 2026-04-07 00:57:33.768299 | orchestrator | service-ks-register : placement | Creating/deleting services ----------- 53.80s 2026-04-07 00:57:33.769169 | orchestrator | placement : include_tasks ----------------------------------------------- 0.78s 2026-04-07 00:57:33.769185 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.39s 2026-04-07 00:57:33.769193 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.38s 2026-04-07 00:57:33.769201 | orchestrator | 2026-04-07 00:57:33.769209 | orchestrator | 2026-04-07 00:57:33.769218 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 00:57:33.769225 | orchestrator | 2026-04-07 00:57:33.769234 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 00:57:33.769242 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:00.381) 0:00:00.381 ********* 2026-04-07 00:57:33.769250 | orchestrator | ok: [testbed-node-0] 2026-04-07 00:57:33.769259 | orchestrator | ok: [testbed-node-1] 2026-04-07 00:57:33.769267 | orchestrator | ok: [testbed-node-2] 2026-04-07 00:57:33.769275 | orchestrator | 2026-04-07 00:57:33.769283 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 00:57:33.769291 | orchestrator | Tuesday 07 April 2026 00:54:37 +0000 (0:00:00.558) 0:00:00.940 ********* 2026-04-07 00:57:33.769299 | orchestrator | ok: [testbed-node-0] => (item=enable_magnum_True) 2026-04-07 00:57:33.769308 | orchestrator | ok: [testbed-node-1] => (item=enable_magnum_True) 2026-04-07 00:57:33.769316 | orchestrator | ok: [testbed-node-2] => (item=enable_magnum_True) 2026-04-07 00:57:33.769324 | orchestrator 
| 2026-04-07 00:57:33.769346 | orchestrator | PLAY [Apply role magnum] ******************************************************* 2026-04-07 00:57:33.769354 | orchestrator | 2026-04-07 00:57:33.769399 | orchestrator | TASK [magnum : include_tasks] ************************************************** 2026-04-07 00:57:33.769409 | orchestrator | Tuesday 07 April 2026 00:54:38 +0000 (0:00:00.351) 0:00:01.292 ********* 2026-04-07 00:57:33.769417 | orchestrator | included: /ansible/roles/magnum/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 00:57:33.769426 | orchestrator | 2026-04-07 00:57:33.769435 | orchestrator | TASK [service-ks-register : magnum | Creating/deleting services] *************** 2026-04-07 00:57:33.769475 | orchestrator | Tuesday 07 April 2026 00:54:38 +0000 (0:00:00.632) 0:00:01.924 ********* 2026-04-07 00:57:33.769506 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (5 retries left). 2026-04-07 00:57:33.769520 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (4 retries left). 2026-04-07 00:57:33.769533 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (3 retries left). 2026-04-07 00:57:33.769545 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (2 retries left). 2026-04-07 00:57:33.769559 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (1 retries left). 
2026-04-07 00:57:33.769570 | orchestrator | failed: [testbed-node-0] (item=magnum (container-infra)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Container Infrastructure Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9511/v1"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9511/v1"}], "name": "magnum", "type": "container-infra"}, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-07 00:57:33.769582 | orchestrator |
2026-04-07 00:57:33.769590 | orchestrator | PLAY RECAP *********************************************************************
2026-04-07 00:57:33.769599 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2026-04-07 00:57:33.769607 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:57:33.769617 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-07 00:57:33.769625 | orchestrator |
2026-04-07 00:57:33.769633 | orchestrator |
2026-04-07 00:57:33.769641 | orchestrator | TASKS RECAP ********************************************************************
2026-04-07 00:57:33.769649 | orchestrator | Tuesday 07 April 2026 00:55:32 +0000 (0:00:53.797) 0:00:55.721 *********
2026-04-07 00:57:33.769657 | orchestrator | ===============================================================================
2026-04-07 00:57:33.769665 | orchestrator | service-ks-register : magnum | Creating/deleting services -------------- 53.80s
2026-04-07 00:57:33.769673 | orchestrator | magnum : include_tasks -------------------------------------------------- 0.63s
2026-04-07 00:57:33.769681 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.56s
2026-04-07 00:57:33.769689 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.35s
2026-04-07 00:57:33.769696 | orchestrator |
2026-04-07 00:57:33.769704 | orchestrator | [WARNING]: Collection community.general does not support Ansible version
2026-04-07 00:57:33.769713 | orchestrator | 2.16.14
2026-04-07 00:57:33.769721 | orchestrator |
2026-04-07 00:57:33.769731 | orchestrator | PLAY [Create ceph pools] *******************************************************
2026-04-07 00:57:33.769741 | orchestrator |
2026-04-07 00:57:33.769750 | orchestrator | TASK [ceph-facts : Include facts.yml] ******************************************
2026-04-07 00:57:33.769760 | orchestrator | Tuesday 07 April 2026 00:55:25 +0000 (0:00:00.540) 0:00:00.540 *********
2026-04-07 00:57:33.769776 | orchestrator | included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-07 00:57:33.769793 | orchestrator |
2026-04-07 00:57:33.769802 | orchestrator | TASK [ceph-facts : Check if it is atomic host] *********************************
2026-04-07 00:57:33.769812 | orchestrator | Tuesday 07 April 2026 00:55:26 +0000 (0:00:00.571) 0:00:01.112 *********
2026-04-07 00:57:33.769821 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.769830 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.769839 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.769849 | orchestrator |
2026-04-07 00:57:33.769858 | orchestrator | TASK [ceph-facts : Set_fact is_atomic] *****************************************
2026-04-07 00:57:33.769867 | orchestrator | Tuesday 07 April 2026 00:55:27 +0000 (0:00:00.954) 0:00:02.066 *********
2026-04-07 00:57:33.769876 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.769886 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.769895 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.769904 | orchestrator |
2026-04-07 00:57:33.769913 | orchestrator | TASK [ceph-facts : Check if podman binary is present] **************************
2026-04-07 00:57:33.769923 | orchestrator | Tuesday 07 April 2026 00:55:27 +0000 (0:00:00.273) 0:00:02.339 *********
2026-04-07 00:57:33.769932 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.769941 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.769950 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.769960 | orchestrator |
2026-04-07 00:57:33.769969 | orchestrator | TASK [ceph-facts : Set_fact container_binary] **********************************
2026-04-07 00:57:33.769978 | orchestrator | Tuesday 07 April 2026 00:55:28 +0000 (0:00:00.799) 0:00:03.139 *********
2026-04-07 00:57:33.769987 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.769996 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.770005 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.770064 | orchestrator |
2026-04-07 00:57:33.770076 | orchestrator | TASK [ceph-facts : Set_fact ceph_cmd] ******************************************
2026-04-07 00:57:33.770149 | orchestrator | Tuesday 07 April 2026 00:55:28 +0000 (0:00:00.299) 0:00:03.438 *********
2026-04-07 00:57:33.770160 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.770169 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.770177 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.770185 | orchestrator |
2026-04-07 00:57:33.770193 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python] *********************
2026-04-07 00:57:33.770201 | orchestrator | Tuesday 07 April 2026 00:55:29 +0000 (0:00:00.282) 0:00:03.721 *********
2026-04-07 00:57:33.770209 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.770217 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.770225 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.770233 | orchestrator |
2026-04-07 00:57:33.770241 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] ***
2026-04-07 00:57:33.770249 | orchestrator | Tuesday 07 April 2026 00:55:29 +0000 (0:00:00.302) 0:00:04.024 *********
2026-04-07 00:57:33.770257 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.770265 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.770273 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.770281 | orchestrator |
2026-04-07 00:57:33.770289 | orchestrator | TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ******************
2026-04-07 00:57:33.770297 | orchestrator | Tuesday 07 April 2026 00:55:29 +0000 (0:00:00.453) 0:00:04.477 *********
2026-04-07 00:57:33.770305 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.770313 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.770321 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.770329 | orchestrator |
2026-04-07 00:57:33.770336 | orchestrator | TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************
2026-04-07 00:57:33.770344 | orchestrator | Tuesday 07 April 2026 00:55:30 +0000 (0:00:00.294) 0:00:04.772 *********
2026-04-07 00:57:33.770352 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-07 00:57:33.770360 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-07 00:57:33.770368 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-07 00:57:33.770382 | orchestrator |
2026-04-07 00:57:33.770390 | orchestrator | TASK [ceph-facts : Set_fact container_exec_cmd] ********************************
2026-04-07 00:57:33.770398 | orchestrator | Tuesday 07 April 2026 00:55:30 +0000 (0:00:00.628) 0:00:05.400 *********
2026-04-07 00:57:33.770406 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.770414 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.770422 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.770430 | orchestrator |
2026-04-07 00:57:33.770442 | orchestrator | TASK [ceph-facts : Find a running mon container] *******************************
2026-04-07 00:57:33.770456 | orchestrator | Tuesday 07 April 2026 00:55:31 +0000 (0:00:00.394) 0:00:05.795 *********
2026-04-07 00:57:33.770469 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-07 00:57:33.770482 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-07 00:57:33.770495 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-07 00:57:33.770508 | orchestrator |
2026-04-07 00:57:33.770522 | orchestrator | TASK [ceph-facts : Check for a ceph mon socket] ********************************
2026-04-07 00:57:33.770536 | orchestrator | Tuesday 07 April 2026 00:55:34 +0000 (0:00:02.979) 0:00:08.774 *********
2026-04-07 00:57:33.770550 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-04-07 00:57:33.770565 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-04-07 00:57:33.770579 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-04-07 00:57:33.770587 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.770596 | orchestrator |
2026-04-07 00:57:33.770604 | orchestrator | TASK [ceph-facts : Check if the ceph mon socket is in-use] *********************
2026-04-07 00:57:33.770612 | orchestrator | Tuesday 07 April 2026 00:55:34 +0000 (0:00:00.349) 0:00:09.124 *********
2026-04-07 00:57:33.770622 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770638 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770647 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770655 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.770663 | orchestrator |
2026-04-07 00:57:33.770671 | orchestrator | TASK [ceph-facts : Set_fact running_mon - non_container] ***********************
2026-04-07 00:57:33.770679 | orchestrator | Tuesday 07 April 2026 00:55:35 +0000 (0:00:00.629) 0:00:09.754 *********
2026-04-07 00:57:33.770689 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770730 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770740 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770755 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.770763 | orchestrator |
2026-04-07 00:57:33.770771 | orchestrator | TASK [ceph-facts : Set_fact running_mon - container] ***************************
2026-04-07 00:57:33.770779 | orchestrator | Tuesday 07 April 2026 00:55:35 +0000 (0:00:00.138) 0:00:09.892 *********
2026-04-07 00:57:33.770789 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': 'b848273fc6ee', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2026-04-07 00:55:32.111872', 'end': '2026-04-07 00:55:32.153010', 'delta': '0:00:00.041138', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['b848273fc6ee'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770801 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '1ebb1786781d', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2026-04-07 00:55:33.169237', 'end': '2026-04-07 00:55:33.204278', 'delta': '0:00:00.035041', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['1ebb1786781d'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770813 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '70e8a27bf433', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2026-04-07 00:55:33.946059', 'end': '2026-04-07 00:55:33.986775', 'delta': '0:00:00.040716', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['70e8a27bf433'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-07 00:57:33.770822 | orchestrator |
2026-04-07 00:57:33.770830 | orchestrator | TASK [ceph-facts : Set_fact _container_exec_cmd] *******************************
2026-04-07 00:57:33.770838 | orchestrator | Tuesday 07 April 2026 00:55:35 +0000 (0:00:00.273) 0:00:10.165 *********
2026-04-07 00:57:33.770846 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.770855 | orchestrator | ok: [testbed-node-4]
2026-04-07 00:57:33.770863 | orchestrator | ok: [testbed-node-5]
2026-04-07 00:57:33.770871 | orchestrator |
2026-04-07 00:57:33.770879 | orchestrator | TASK [ceph-facts : Get current fsid if cluster is already running] *************
2026-04-07 00:57:33.770887 | orchestrator | Tuesday 07 April 2026 00:55:35 +0000 (0:00:00.365) 0:00:10.531 *********
2026-04-07 00:57:33.770895 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)]
2026-04-07 00:57:33.770903 | orchestrator |
2026-04-07 00:57:33.770911 | orchestrator | TASK [ceph-facts : Set_fact current_fsid rc 1] *********************************
2026-04-07 00:57:33.770919 | orchestrator | Tuesday 07 April 2026 00:55:37 +0000 (0:00:01.835) 0:00:12.367 *********
2026-04-07 00:57:33.770927 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.770935 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.770948 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.770956 | orchestrator |
2026-04-07 00:57:33.770964 | orchestrator | TASK [ceph-facts : Get current fsid] *******************************************
2026-04-07 00:57:33.770972 | orchestrator | Tuesday 07 April 2026 00:55:37 +0000 (0:00:00.261) 0:00:12.628 *********
2026-04-07 00:57:33.771001 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771011 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771019 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771027 | orchestrator |
2026-04-07 00:57:33.771035 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2026-04-07 00:57:33.771043 | orchestrator | Tuesday 07 April 2026 00:55:38 +0000 (0:00:00.415) 0:00:13.044 *********
2026-04-07 00:57:33.771051 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771059 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771067 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771075 | orchestrator |
2026-04-07 00:57:33.771083 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] ****************************
2026-04-07 00:57:33.771091 | orchestrator | Tuesday 07 April 2026 00:55:38 +0000 (0:00:00.472) 0:00:13.516 *********
2026-04-07 00:57:33.771126 | orchestrator | ok: [testbed-node-3]
2026-04-07 00:57:33.771136 | orchestrator |
2026-04-07 00:57:33.771144 | orchestrator | TASK [ceph-facts : Generate cluster fsid] **************************************
2026-04-07 00:57:33.771152 | orchestrator | Tuesday 07 April 2026 00:55:38 +0000 (0:00:00.128) 0:00:13.645 *********
2026-04-07 00:57:33.771160 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771168 | orchestrator |
2026-04-07 00:57:33.771176 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2026-04-07 00:57:33.771184 | orchestrator | Tuesday 07 April 2026 00:55:39 +0000 (0:00:00.216) 0:00:13.862 *********
2026-04-07 00:57:33.771192 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771200 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771208 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771215 | orchestrator |
2026-04-07 00:57:33.771223 | orchestrator | TASK [ceph-facts : Resolve device link(s)] *************************************
2026-04-07 00:57:33.771231 | orchestrator | Tuesday 07 April 2026 00:55:39 +0000 (0:00:00.278) 0:00:14.140 *********
2026-04-07 00:57:33.771239 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771247 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771255 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771262 | orchestrator |
2026-04-07 00:57:33.771270 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved symlinks] **************
2026-04-07 00:57:33.771278 | orchestrator | Tuesday 07 April 2026 00:55:39 +0000 (0:00:00.316) 0:00:14.457 *********
2026-04-07 00:57:33.771286 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771294 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771302 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771310 | orchestrator |
2026-04-07 00:57:33.771318 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] ***************************
2026-04-07 00:57:33.771325 | orchestrator | Tuesday 07 April 2026 00:55:40 +0000 (0:00:00.499) 0:00:14.956 *********
2026-04-07 00:57:33.771333 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771341 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771349 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771357 | orchestrator |
2026-04-07 00:57:33.771365 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] ****
2026-04-07 00:57:33.771373 | orchestrator | Tuesday 07 April 2026 00:55:40 +0000 (0:00:00.310) 0:00:15.267 *********
2026-04-07 00:57:33.771380 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771388 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771396 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771404 | orchestrator |
2026-04-07 00:57:33.771412 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] ***********************
2026-04-07 00:57:33.771420 | orchestrator | Tuesday 07 April 2026 00:55:40 +0000 (0:00:00.310) 0:00:15.578 *********
2026-04-07 00:57:33.771434 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771448 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771461 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771474 | orchestrator |
2026-04-07 00:57:33.771488 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] ***
2026-04-07 00:57:33.771502 | orchestrator | Tuesday 07 April 2026 00:55:41 +0000 (0:00:00.309) 0:00:15.887 *********
2026-04-07 00:57:33.771516 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771527 | orchestrator | skipping: [testbed-node-4]
2026-04-07 00:57:33.771535 | orchestrator | skipping: [testbed-node-5]
2026-04-07 00:57:33.771543 | orchestrator |
2026-04-07 00:57:33.771551 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************
2026-04-07 00:57:33.771564 | orchestrator | Tuesday 07 April 2026 00:55:41 +0000 (0:00:00.473) 0:00:16.360 *********
2026-04-07 00:57:33.771574 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e0113da9--ca02--59fe--bdca--d5482abf5fe2-osd--block--e0113da9--ca02--59fe--bdca--d5482abf5fe2', 'dm-uuid-LVM-qrJ0lEo0sbfKYWJnOUkfPiYNIdhxxy3DFJOxYc3XSynkbT8r9ZAsZinTdj4C3pwv'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771612 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--9eeb51fd--cca7--5129--bb0c--15bc93c67722-osd--block--9eeb51fd--cca7--5129--bb0c--15bc93c67722', 'dm-uuid-LVM-nUbhE8JyxWI4yIlTiMfwGTfCsIQCTAaXH2kS21Y9fbfPK9wfe5kU86dUi9uvkF2I'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771622 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771631 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771639 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771648 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771656 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771670 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771682 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771691 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-07 00:57:33.771725 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part1', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part14', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part15', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part16', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-07 00:57:33.771737 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--e0113da9--ca02--59fe--bdca--d5482abf5fe2-osd--block--e0113da9--ca02--59fe--bdca--d5482abf5fe2'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-1uSdjJ-FReU-Bejh-1mIK-IacW-k7Ls-RcaZks', 'scsi-0QEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76', 'scsi-SQEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-07 00:57:33.771757 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--9eeb51fd--cca7--5129--bb0c--15bc93c67722-osd--block--9eeb51fd--cca7--5129--bb0c--15bc93c67722'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-2e1P9d-pfP2-eXUQ-ccQa-yPfw-fE2I-SCS1ff', 'scsi-0QEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88', 'scsi-SQEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-07 00:57:33.771767 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d', 'scsi-SQEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-07 00:57:33.771797 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-22-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-07 00:57:33.771807 | orchestrator | skipping: [testbed-node-3]
2026-04-07 00:57:33.771816 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [],
'host': '', 'links': {'ids': ['dm-name-ceph--f75c5f18--ff10--5900--9978--917c146f798b-osd--block--f75c5f18--ff10--5900--9978--917c146f798b', 'dm-uuid-LVM-yC88MiN3PryvE0fvbIhwr0IrRujbsAZCfMAG8Wujapp4JvfYCrcYMxGdRouUeoG9'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771824 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--47815a29--012a--570b--a074--b4436c47a2f4-osd--block--47815a29--012a--570b--a074--b4436c47a2f4', 'dm-uuid-LVM-uQuuByfRrNJe2RSEgkC5hvKAsqpHeQMR8CPgZcXk6LO0dL9kCsyBt1HiTJmi4USt'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771833 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771851 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 
'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771860 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771872 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771881 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0842dd12--8111--558f--8152--9e8987e1446c-osd--block--0842dd12--8111--558f--8152--9e8987e1446c', 'dm-uuid-LVM-78tLYoniV2zuzKbpFSVYh6asI7K2E633YlBjjslh7SRkoyZBrDaNagtVPi2vq3sj'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771890 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 
'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771933 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e59b5a6a--4894--5883--a5b3--f677d5bde0c7-osd--block--e59b5a6a--4894--5883--a5b3--f677d5bde0c7', 'dm-uuid-LVM-ARf5D8B94Jgn5F8asnJUBcF8eZEuUPcfWT1TpcC3liLoLTzUKcmVjwveJKBatcEE'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771953 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771967 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.771989 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 
'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772003 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772017 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772036 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772061 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part1', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part14', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part15', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part16', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772081 | 
orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772090 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--f75c5f18--ff10--5900--9978--917c146f798b-osd--block--f75c5f18--ff10--5900--9978--917c146f798b'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-u2UQbS-QvW9-d0NA-ibFW-i6XF-xBrx-eZdX0A', 'scsi-0QEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba', 'scsi-SQEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772123 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772137 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--47815a29--012a--570b--a074--b4436c47a2f4-osd--block--47815a29--012a--570b--a074--b4436c47a2f4'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-7nqGLU-eOnu-DWE9-Bjej-9NH7-c88D-6HT0bS', 'scsi-0QEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6', 'scsi-SQEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772146 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772162 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696', 'scsi-SQEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772171 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772184 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-17-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772193 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.772202 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 
'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-07 00:57:33.772215 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part1', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part14', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part15', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part16', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 
'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772230 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--0842dd12--8111--558f--8152--9e8987e1446c-osd--block--0842dd12--8111--558f--8152--9e8987e1446c'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-krTUGj-5S7a-soWH-3nId-RGto-83dV-k7X561', 'scsi-0QEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd', 'scsi-SQEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772244 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--e59b5a6a--4894--5883--a5b3--f677d5bde0c7-osd--block--e59b5a6a--4894--5883--a5b3--f677d5bde0c7'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-vSHFoZ-edkC-yorr-qFsk-jrGs-AtMj-CzgncF', 'scsi-0QEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349', 'scsi-SQEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772253 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8', 'scsi-SQEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772261 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-26-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-07 00:57:33.772270 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.772278 | orchestrator | 2026-04-07 00:57:33.772286 | orchestrator | TASK [ceph-facts : Set_fact devices 
generate device list when osd_auto_discovery] *** 2026-04-07 00:57:33.772298 | orchestrator | Tuesday 07 April 2026 00:55:42 +0000 (0:00:00.527) 0:00:16.888 ********* 2026-04-07 00:57:33.772307 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e0113da9--ca02--59fe--bdca--d5482abf5fe2-osd--block--e0113da9--ca02--59fe--bdca--d5482abf5fe2', 'dm-uuid-LVM-qrJ0lEo0sbfKYWJnOUkfPiYNIdhxxy3DFJOxYc3XSynkbT8r9ZAsZinTdj4C3pwv'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772323 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--9eeb51fd--cca7--5129--bb0c--15bc93c67722-osd--block--9eeb51fd--cca7--5129--bb0c--15bc93c67722', 'dm-uuid-LVM-nUbhE8JyxWI4yIlTiMfwGTfCsIQCTAaXH2kS21Y9fbfPK9wfe5kU86dUi9uvkF2I'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772338 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 
'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772346 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772355 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772366 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': 
{'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772375 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772383 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772397 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': 
['dm-name-ceph--f75c5f18--ff10--5900--9978--917c146f798b-osd--block--f75c5f18--ff10--5900--9978--917c146f798b', 'dm-uuid-LVM-yC88MiN3PryvE0fvbIhwr0IrRujbsAZCfMAG8Wujapp4JvfYCrcYMxGdRouUeoG9'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772410 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772419 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--47815a29--012a--570b--a074--b4436c47a2f4-osd--block--47815a29--012a--570b--a074--b4436c47a2f4', 'dm-uuid-LVM-uQuuByfRrNJe2RSEgkC5hvKAsqpHeQMR8CPgZcXk6LO0dL9kCsyBt1HiTJmi4USt'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'}) 
 2026-04-07 00:57:33.772427 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772441 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772470 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part1', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part14', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part15', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part16', 'scsi-SQEMU_QEMU_HARDDISK_5f996502-1da9-49e5-9e1f-f2a253186967-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-07 00:57:33.772499 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772523 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--e0113da9--ca02--59fe--bdca--d5482abf5fe2-osd--block--e0113da9--ca02--59fe--bdca--d5482abf5fe2'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-1uSdjJ-FReU-Bejh-1mIK-IacW-k7Ls-RcaZks', 'scsi-0QEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76', 'scsi-SQEMU_QEMU_HARDDISK_0aceb24c-1141-4b89-81c4-2bd069400a76'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772536 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772555 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--9eeb51fd--cca7--5129--bb0c--15bc93c67722-osd--block--9eeb51fd--cca7--5129--bb0c--15bc93c67722'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-2e1P9d-pfP2-eXUQ-ccQa-yPfw-fE2I-SCS1ff', 'scsi-0QEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88', 'scsi-SQEMU_QEMU_HARDDISK_ee2515b7-1de0-4cb8-a492-67bb0415ec88'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772575 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d', 'scsi-SQEMU_QEMU_HARDDISK_d98a6229-64c7-4f26-837e-eda0f824cf1d'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772587 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 
'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772599 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-22-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772617 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772631 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.772644 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 
'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772670 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772683 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772703 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part1', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part14', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part15', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part16', 'scsi-SQEMU_QEMU_HARDDISK_3bcc4df9-bd9b-42b2-a91b-3b385d323e82-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-07 00:57:33.772727 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--f75c5f18--ff10--5900--9978--917c146f798b-osd--block--f75c5f18--ff10--5900--9978--917c146f798b'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-u2UQbS-QvW9-d0NA-ibFW-i6XF-xBrx-eZdX0A', 'scsi-0QEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba', 'scsi-SQEMU_QEMU_HARDDISK_967b79e7-41ef-439c-974d-46e00c7544ba'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772751 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0842dd12--8111--558f--8152--9e8987e1446c-osd--block--0842dd12--8111--558f--8152--9e8987e1446c', 'dm-uuid-LVM-78tLYoniV2zuzKbpFSVYh6asI7K2E633YlBjjslh7SRkoyZBrDaNagtVPi2vq3sj'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772765 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 
'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--47815a29--012a--570b--a074--b4436c47a2f4-osd--block--47815a29--012a--570b--a074--b4436c47a2f4'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-7nqGLU-eOnu-DWE9-Bjej-9NH7-c88D-6HT0bS', 'scsi-0QEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6', 'scsi-SQEMU_QEMU_HARDDISK_18dce6fc-4f14-415a-9461-5b764394eff6'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772787 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696', 'scsi-SQEMU_QEMU_HARDDISK_1469229d-4b75-4251-a9b8-5b75cda4a696'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772797 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e59b5a6a--4894--5883--a5b3--f677d5bde0c7-osd--block--e59b5a6a--4894--5883--a5b3--f677d5bde0c7', 'dm-uuid-LVM-ARf5D8B94Jgn5F8asnJUBcF8eZEuUPcfWT1TpcC3liLoLTzUKcmVjwveJKBatcEE'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772818 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-17-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 
253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772826 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772835 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.772843 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772852 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 
0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772860 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772873 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772887 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 
'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772900 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772908 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772922 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part1', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part14', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part15', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part16', 'scsi-SQEMU_QEMU_HARDDISK_4b41c8c8-cb46-4d77-a16f-c33d82450db9-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-07 00:57:33.772940 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--0842dd12--8111--558f--8152--9e8987e1446c-osd--block--0842dd12--8111--558f--8152--9e8987e1446c'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-krTUGj-5S7a-soWH-3nId-RGto-83dV-k7X561', 'scsi-0QEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd', 'scsi-SQEMU_QEMU_HARDDISK_d9b6b982-5d2c-47ad-95ce-6e4d358a27cd'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772950 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--e59b5a6a--4894--5883--a5b3--f677d5bde0c7-osd--block--e59b5a6a--4894--5883--a5b3--f677d5bde0c7'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-vSHFoZ-edkC-yorr-qFsk-jrGs-AtMj-CzgncF', 'scsi-0QEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349', 'scsi-SQEMU_QEMU_HARDDISK_61826d0c-ccdc-4393-b392-5dc26cd19349'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772959 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8', 'scsi-SQEMU_QEMU_HARDDISK_e06458de-fcc8-49b9-b479-fcb02169b5c8'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772971 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-07-00-03-26-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-07 00:57:33.772980 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.772988 | orchestrator | 2026-04-07 00:57:33.772996 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ****************************** 2026-04-07 00:57:33.773004 | orchestrator | Tuesday 07 April 2026 00:55:42 +0000 (0:00:00.584) 0:00:17.472 ********* 2026-04-07 00:57:33.773013 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:57:33.773026 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:57:33.773034 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:57:33.773042 | orchestrator | 2026-04-07 00:57:33.773050 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] *************** 2026-04-07 00:57:33.773063 | orchestrator | Tuesday 07 April 2026 00:55:43 +0000 (0:00:00.652) 0:00:18.124 ********* 2026-04-07 00:57:33.773080 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:57:33.773152 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:57:33.773168 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:57:33.773180 | orchestrator | 2026-04-07 00:57:33.773192 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] *************************** 2026-04-07 00:57:33.773205 | orchestrator | Tuesday 07 April 2026 00:55:43 +0000 (0:00:00.439) 0:00:18.563 ********* 2026-04-07 00:57:33.773217 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:57:33.773230 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:57:33.773243 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:57:33.773256 | orchestrator | 2026-04-07 00:57:33.773270 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] *********************** 2026-04-07 00:57:33.773284 | orchestrator | Tuesday 07 April 2026 00:55:44 +0000 (0:00:00.689) 0:00:19.253 
********* 2026-04-07 00:57:33.773292 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773301 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.773313 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.773326 | orchestrator | 2026-04-07 00:57:33.773340 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] *************************** 2026-04-07 00:57:33.773353 | orchestrator | Tuesday 07 April 2026 00:55:44 +0000 (0:00:00.267) 0:00:19.520 ********* 2026-04-07 00:57:33.773375 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773389 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.773403 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.773416 | orchestrator | 2026-04-07 00:57:33.773430 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] *********************** 2026-04-07 00:57:33.773440 | orchestrator | Tuesday 07 April 2026 00:55:45 +0000 (0:00:00.410) 0:00:19.930 ********* 2026-04-07 00:57:33.773448 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773456 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.773464 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.773472 | orchestrator | 2026-04-07 00:57:33.773480 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] ************************* 2026-04-07 00:57:33.773488 | orchestrator | Tuesday 07 April 2026 00:55:45 +0000 (0:00:00.491) 0:00:20.422 ********* 2026-04-07 00:57:33.773496 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0) 2026-04-07 00:57:33.773504 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0) 2026-04-07 00:57:33.773512 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1) 2026-04-07 00:57:33.773520 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0) 2026-04-07 00:57:33.773528 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1) 2026-04-07 00:57:33.773536 | orchestrator 
| ok: [testbed-node-3] => (item=testbed-node-2) 2026-04-07 00:57:33.773544 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1) 2026-04-07 00:57:33.773552 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2) 2026-04-07 00:57:33.773559 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2) 2026-04-07 00:57:33.773567 | orchestrator | 2026-04-07 00:57:33.773576 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] ************************* 2026-04-07 00:57:33.773584 | orchestrator | Tuesday 07 April 2026 00:55:46 +0000 (0:00:00.872) 0:00:21.295 ********* 2026-04-07 00:57:33.773592 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2026-04-07 00:57:33.773600 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2026-04-07 00:57:33.773608 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2026-04-07 00:57:33.773615 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773623 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)  2026-04-07 00:57:33.773631 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)  2026-04-07 00:57:33.773647 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)  2026-04-07 00:57:33.773655 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.773663 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)  2026-04-07 00:57:33.773670 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)  2026-04-07 00:57:33.773677 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)  2026-04-07 00:57:33.773684 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.773690 | orchestrator | 2026-04-07 00:57:33.773697 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] *********************** 2026-04-07 00:57:33.773704 | orchestrator | Tuesday 07 April 2026 00:55:46 +0000 (0:00:00.355) 0:00:21.650 ********* 2026-04-07 
00:57:33.773711 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 00:57:33.773718 | orchestrator | 2026-04-07 00:57:33.773725 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2026-04-07 00:57:33.773732 | orchestrator | Tuesday 07 April 2026 00:55:47 +0000 (0:00:00.700) 0:00:22.350 ********* 2026-04-07 00:57:33.773739 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773746 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.773753 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.773759 | orchestrator | 2026-04-07 00:57:33.773766 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2026-04-07 00:57:33.773773 | orchestrator | Tuesday 07 April 2026 00:55:48 +0000 (0:00:00.338) 0:00:22.689 ********* 2026-04-07 00:57:33.773780 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773787 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.773793 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.773800 | orchestrator | 2026-04-07 00:57:33.773815 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2026-04-07 00:57:33.773822 | orchestrator | Tuesday 07 April 2026 00:55:48 +0000 (0:00:00.314) 0:00:23.004 ********* 2026-04-07 00:57:33.773829 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773835 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.773842 | orchestrator | skipping: [testbed-node-5] 2026-04-07 00:57:33.773849 | orchestrator | 2026-04-07 00:57:33.773856 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2026-04-07 00:57:33.773862 | orchestrator | Tuesday 07 April 2026 00:55:48 +0000 (0:00:00.295) 0:00:23.300 ********* 2026-04-07 
00:57:33.773869 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:57:33.773876 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:57:33.773883 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:57:33.773889 | orchestrator | 2026-04-07 00:57:33.773896 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2026-04-07 00:57:33.773903 | orchestrator | Tuesday 07 April 2026 00:55:49 +0000 (0:00:00.610) 0:00:23.910 ********* 2026-04-07 00:57:33.773910 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:57:33.773916 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:57:33.773923 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:57:33.773930 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773936 | orchestrator | 2026-04-07 00:57:33.773943 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2026-04-07 00:57:33.773950 | orchestrator | Tuesday 07 April 2026 00:55:49 +0000 (0:00:00.352) 0:00:24.262 ********* 2026-04-07 00:57:33.773957 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:57:33.773964 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:57:33.773970 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:57:33.773981 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.773988 | orchestrator | 2026-04-07 00:57:33.773995 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2026-04-07 00:57:33.774008 | orchestrator | Tuesday 07 April 2026 00:55:49 +0000 (0:00:00.345) 0:00:24.608 ********* 2026-04-07 00:57:33.774058 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-07 00:57:33.774069 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-07 00:57:33.774081 | 
orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-07 00:57:33.774092 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.774148 | orchestrator | 2026-04-07 00:57:33.774159 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2026-04-07 00:57:33.774169 | orchestrator | Tuesday 07 April 2026 00:55:50 +0000 (0:00:00.341) 0:00:24.949 ********* 2026-04-07 00:57:33.774179 | orchestrator | ok: [testbed-node-3] 2026-04-07 00:57:33.774189 | orchestrator | ok: [testbed-node-4] 2026-04-07 00:57:33.774199 | orchestrator | ok: [testbed-node-5] 2026-04-07 00:57:33.774208 | orchestrator | 2026-04-07 00:57:33.774218 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2026-04-07 00:57:33.774227 | orchestrator | Tuesday 07 April 2026 00:55:50 +0000 (0:00:00.326) 0:00:25.276 ********* 2026-04-07 00:57:33.774237 | orchestrator | ok: [testbed-node-3] => (item=0) 2026-04-07 00:57:33.774246 | orchestrator | ok: [testbed-node-4] => (item=0) 2026-04-07 00:57:33.774255 | orchestrator | ok: [testbed-node-5] => (item=0) 2026-04-07 00:57:33.774265 | orchestrator | 2026-04-07 00:57:33.774276 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] ************************************** 2026-04-07 00:57:33.774286 | orchestrator | Tuesday 07 April 2026 00:55:51 +0000 (0:00:00.505) 0:00:25.781 ********* 2026-04-07 00:57:33.774297 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-04-07 00:57:33.774308 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-04-07 00:57:33.774319 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-04-07 00:57:33.774328 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2026-04-07 00:57:33.774339 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => 
(item=testbed-node-4) 2026-04-07 00:57:33.774349 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2026-04-07 00:57:33.774359 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2026-04-07 00:57:33.774369 | orchestrator | 2026-04-07 00:57:33.774378 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ******************************** 2026-04-07 00:57:33.774388 | orchestrator | Tuesday 07 April 2026 00:55:52 +0000 (0:00:00.954) 0:00:26.736 ********* 2026-04-07 00:57:33.774398 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-04-07 00:57:33.774408 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-04-07 00:57:33.774418 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-04-07 00:57:33.774428 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2026-04-07 00:57:33.774439 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2026-04-07 00:57:33.774450 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2026-04-07 00:57:33.774460 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2026-04-07 00:57:33.774471 | orchestrator | 2026-04-07 00:57:33.774482 | orchestrator | TASK [Include tasks from the ceph-osd role] ************************************ 2026-04-07 00:57:33.774493 | orchestrator | Tuesday 07 April 2026 00:55:53 +0000 (0:00:01.910) 0:00:28.647 ********* 2026-04-07 00:57:33.774503 | orchestrator | skipping: [testbed-node-3] 2026-04-07 00:57:33.774515 | orchestrator | skipping: [testbed-node-4] 2026-04-07 00:57:33.774527 | orchestrator | included: /ansible/tasks/openstack_config.yml for testbed-node-5 2026-04-07 00:57:33.774534 | orchestrator | 2026-04-07 00:57:33.774547 | 
orchestrator | TASK [create openstack pool(s)] ************************************************ 2026-04-07 00:57:33.774561 | orchestrator | Tuesday 07 April 2026 00:55:54 +0000 (0:00:00.363) 0:00:29.011 ********* 2026-04-07 00:57:33.774569 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'backups', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-07 00:57:33.774578 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'volumes', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-07 00:57:33.774584 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'images', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-07 00:57:33.774603 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'metrics', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-07 00:57:33.774613 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'vms', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-07 00:57:33.774623 | orchestrator | 2026-04-07 00:57:33.774633 | orchestrator | TASK [generate keys] 
*********************************************************** 2026-04-07 00:57:33.774642 | orchestrator | Tuesday 07 April 2026 00:56:38 +0000 (0:00:44.316) 0:01:13.327 ********* 2026-04-07 00:57:33.774652 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774662 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774672 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774682 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774692 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774702 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774712 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] }}] 2026-04-07 00:57:33.774723 | orchestrator | 2026-04-07 00:57:33.774733 | orchestrator | TASK [get keys from monitors] ************************************************** 2026-04-07 00:57:33.774744 | orchestrator | Tuesday 07 April 2026 00:57:02 +0000 (0:00:23.874) 0:01:37.202 ********* 2026-04-07 00:57:33.774755 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774765 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774775 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774785 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774791 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774798 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774804 | orchestrator | 
ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}] 2026-04-07 00:57:33.774810 | orchestrator | 2026-04-07 00:57:33.774816 | orchestrator | TASK [copy ceph key(s) if needed] ********************************************** 2026-04-07 00:57:33.774823 | orchestrator | Tuesday 07 April 2026 00:57:14 +0000 (0:00:12.251) 0:01:49.453 ********* 2026-04-07 00:57:33.774836 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774842 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-04-07 00:57:33.774848 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-04-07 00:57:33.774855 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774861 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-04-07 00:57:33.774867 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-04-07 00:57:33.774873 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774880 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-04-07 00:57:33.774886 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-04-07 00:57:33.774897 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774903 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-04-07 00:57:33.774909 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-04-07 00:57:33.774916 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774922 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 
2026-04-07 00:57:33.774928 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-04-07 00:57:33.774934 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-07 00:57:33.774941 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-04-07 00:57:33.774947 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-04-07 00:57:33.774953 | orchestrator | changed: [testbed-node-5 -> {{ item.1 }}] 2026-04-07 00:57:33.774960 | orchestrator | 2026-04-07 00:57:33.774966 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:57:33.774972 | orchestrator | testbed-node-3 : ok=25  changed=0 unreachable=0 failed=0 skipped=28  rescued=0 ignored=0 2026-04-07 00:57:33.774979 | orchestrator | testbed-node-4 : ok=18  changed=0 unreachable=0 failed=0 skipped=21  rescued=0 ignored=0 2026-04-07 00:57:33.774995 | orchestrator | testbed-node-5 : ok=23  changed=3  unreachable=0 failed=0 skipped=20  rescued=0 ignored=0 2026-04-07 00:57:33.775002 | orchestrator | 2026-04-07 00:57:33.775008 | orchestrator | 2026-04-07 00:57:33.775014 | orchestrator | 2026-04-07 00:57:33.775021 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:57:33.775027 | orchestrator | Tuesday 07 April 2026 00:57:32 +0000 (0:00:17.728) 0:02:07.182 ********* 2026-04-07 00:57:33.775033 | orchestrator | =============================================================================== 2026-04-07 00:57:33.775039 | orchestrator | create openstack pool(s) ----------------------------------------------- 44.32s 2026-04-07 00:57:33.775046 | orchestrator | generate keys ---------------------------------------------------------- 23.87s 2026-04-07 00:57:33.775052 | orchestrator | copy ceph key(s) if needed --------------------------------------------- 17.73s 
2026-04-07 00:57:33.775058 | orchestrator | get keys from monitors ------------------------------------------------- 12.25s 2026-04-07 00:57:33.775064 | orchestrator | ceph-facts : Find a running mon container ------------------------------- 2.98s 2026-04-07 00:57:33.775071 | orchestrator | ceph-facts : Set_fact ceph_admin_command -------------------------------- 1.91s 2026-04-07 00:57:33.775079 | orchestrator | ceph-facts : Get current fsid if cluster is already running ------------- 1.84s 2026-04-07 00:57:33.775089 | orchestrator | ceph-facts : Check if it is atomic host --------------------------------- 0.95s 2026-04-07 00:57:33.775123 | orchestrator | ceph-facts : Set_fact ceph_run_cmd -------------------------------------- 0.95s 2026-04-07 00:57:33.775135 | orchestrator | ceph-facts : Set_fact _monitor_addresses - ipv4 ------------------------- 0.87s 2026-04-07 00:57:33.775145 | orchestrator | ceph-facts : Check if podman binary is present -------------------------- 0.80s 2026-04-07 00:57:33.775155 | orchestrator | ceph-facts : Import_tasks set_radosgw_address.yml ----------------------- 0.70s 2026-04-07 00:57:33.775165 | orchestrator | ceph-facts : Read osd pool default crush rule --------------------------- 0.69s 2026-04-07 00:57:33.775173 | orchestrator | ceph-facts : Check if the ceph conf exists ------------------------------ 0.65s 2026-04-07 00:57:33.775184 | orchestrator | ceph-facts : Check if the ceph mon socket is in-use --------------------- 0.63s 2026-04-07 00:57:33.775194 | orchestrator | ceph-facts : Set_fact monitor_name ansible_facts['hostname'] ------------ 0.63s 2026-04-07 00:57:33.775205 | orchestrator | ceph-facts : Set_fact _radosgw_address to radosgw_address --------------- 0.61s 2026-04-07 00:57:33.775216 | orchestrator | ceph-facts : Set_fact devices generate device list when osd_auto_discovery --- 0.58s 2026-04-07 00:57:33.775225 | orchestrator | ceph-facts : Include facts.yml ------------------------------------------ 0.57s 2026-04-07 
00:57:33.775231 | orchestrator | ceph-facts : Collect existed devices ------------------------------------ 0.53s 2026-04-07 00:57:33.775237 | orchestrator | 2026-04-07 00:57:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 00:57:33.775244 | orchestrator | 2026-04-07 00:57:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:57:36.820131 | orchestrator | 2026-04-07 00:57:36 | INFO  | Task e896858a-59fc-43d9-914b-e0b6426090ef is in state STARTED
2026-04-07 00:58:10.344741 | orchestrator | 2026-04-07 00:58:10 | INFO  | Task e896858a-59fc-43d9-914b-e0b6426090ef is in state SUCCESS 2026-04-07 00:58:13.408789 | orchestrator | 2026-04-07 00:58:13 | INFO  | Task 442a566d-274a-445d-a0a5-a11e5fdb1ed4 is in state STARTED
2026-04-07 00:59:11.262149 | orchestrator | 2026-04-07 00:59:11 | INFO  | Task c5300f78-dbb7-4cb6-b629-8810f3863e9b is in state STARTED 2026-04-07 00:59:11.263842 | orchestrator | 2026-04-07 00:59:11 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED 2026-04-07 00:59:11.264700 | orchestrator | 2026-04-07 00:59:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 00:59:11.265957 | orchestrator | 2026-04-07 00:59:11 | INFO  | Task 442a566d-274a-445d-a0a5-a11e5fdb1ed4 is in state SUCCESS 2026-04-07 00:59:11.266182 | orchestrator | 2026-04-07 00:59:11.266218 | orchestrator | 2026-04-07 00:59:11.266224 | orchestrator | PLAY [Copy ceph keys to the configuration repository] ************************** 2026-04-07 00:59:11.266230 | orchestrator | 2026-04-07 00:59:11.266235 | orchestrator | TASK [Check if ceph keys exist] ************************************************ 2026-04-07 00:59:11.266241 | orchestrator | Tuesday 07 April 2026 00:57:36 +0000 (0:00:00.223) 0:00:00.223 ********* 2026-04-07 00:59:11.266246 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2026-04-07 00:59:11.266252 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266256 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266261 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2026-04-07 00:59:11.266266 | orchestrator | ok: [testbed-manager -> 
testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266291 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2026-04-07 00:59:11.266296 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2026-04-07 00:59:11.266301 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 2026-04-07 00:59:11.266306 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2026-04-07 00:59:11.266310 | orchestrator | 2026-04-07 00:59:11.266315 | orchestrator | TASK [Fetch all ceph keys] ***************************************************** 2026-04-07 00:59:11.266320 | orchestrator | Tuesday 07 April 2026 00:57:41 +0000 (0:00:04.961) 0:00:05.184 ********* 2026-04-07 00:59:11.266324 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2026-04-07 00:59:11.266329 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266333 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266338 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2026-04-07 00:59:11.266342 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266347 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2026-04-07 00:59:11.266351 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2026-04-07 00:59:11.266368 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 
2026-04-07 00:59:11.266373 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2026-04-07 00:59:11.266377 | orchestrator | 2026-04-07 00:59:11.266382 | orchestrator | TASK [Create share directory] ************************************************** 2026-04-07 00:59:11.266386 | orchestrator | Tuesday 07 April 2026 00:57:45 +0000 (0:00:04.314) 0:00:09.499 ********* 2026-04-07 00:59:11.266392 | orchestrator | changed: [testbed-manager -> localhost] 2026-04-07 00:59:11.266396 | orchestrator | 2026-04-07 00:59:11.266401 | orchestrator | TASK [Write ceph keys to the share directory] ********************************** 2026-04-07 00:59:11.266405 | orchestrator | Tuesday 07 April 2026 00:57:46 +0000 (0:00:01.068) 0:00:10.568 ********* 2026-04-07 00:59:11.266410 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.admin.keyring) 2026-04-07 00:59:11.266414 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266419 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266423 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder-backup.keyring) 2026-04-07 00:59:11.266427 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266431 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.nova.keyring) 2026-04-07 00:59:11.266436 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.glance.keyring) 2026-04-07 00:59:11.266440 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.gnocchi.keyring) 2026-04-07 00:59:11.266444 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.manila.keyring) 2026-04-07 00:59:11.266449 | orchestrator | 2026-04-07 00:59:11.266453 | orchestrator | TASK [Check if target directories 
exist] *************************************** 2026-04-07 00:59:11.266457 | orchestrator | Tuesday 07 April 2026 00:57:59 +0000 (0:00:13.182) 0:00:23.751 ********* 2026-04-07 00:59:11.266462 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/infrastructure/files/ceph) 2026-04-07 00:59:11.266466 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-volume) 2026-04-07 00:59:11.266474 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-backup) 2026-04-07 00:59:11.266479 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-backup) 2026-04-07 00:59:11.266493 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/nova) 2026-04-07 00:59:11.266500 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/nova) 2026-04-07 00:59:11.266507 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/glance) 2026-04-07 00:59:11.266517 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/gnocchi) 2026-04-07 00:59:11.266525 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/manila) 2026-04-07 00:59:11.266532 | orchestrator | 2026-04-07 00:59:11.266539 | orchestrator | TASK [Write ceph keys to the configuration directory] ************************** 2026-04-07 00:59:11.266545 | orchestrator | Tuesday 07 April 2026 00:58:03 +0000 (0:00:03.238) 0:00:26.989 ********* 2026-04-07 00:59:11.266553 | orchestrator | changed: [testbed-manager] => (item=ceph.client.admin.keyring) 2026-04-07 00:59:11.266560 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266567 | orchestrator | changed: [testbed-manager] => 
(item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266573 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder-backup.keyring) 2026-04-07 00:59:11.266580 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2026-04-07 00:59:11.266587 | orchestrator | changed: [testbed-manager] => (item=ceph.client.nova.keyring) 2026-04-07 00:59:11.266594 | orchestrator | changed: [testbed-manager] => (item=ceph.client.glance.keyring) 2026-04-07 00:59:11.266600 | orchestrator | changed: [testbed-manager] => (item=ceph.client.gnocchi.keyring) 2026-04-07 00:59:11.266607 | orchestrator | changed: [testbed-manager] => (item=ceph.client.manila.keyring) 2026-04-07 00:59:11.266613 | orchestrator | 2026-04-07 00:59:11.266619 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:59:11.266625 | orchestrator | testbed-manager : ok=6  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-07 00:59:11.266646 | orchestrator | 2026-04-07 00:59:11.266661 | orchestrator | 2026-04-07 00:59:11.266667 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:59:11.266674 | orchestrator | Tuesday 07 April 2026 00:58:09 +0000 (0:00:06.771) 0:00:33.760 ********* 2026-04-07 00:59:11.266680 | orchestrator | =============================================================================== 2026-04-07 00:59:11.266688 | orchestrator | Write ceph keys to the share directory --------------------------------- 13.18s 2026-04-07 00:59:11.266695 | orchestrator | Write ceph keys to the configuration directory -------------------------- 6.77s 2026-04-07 00:59:11.266701 | orchestrator | Check if ceph keys exist ------------------------------------------------ 4.96s 2026-04-07 00:59:11.266708 | orchestrator | Fetch all ceph keys ----------------------------------------------------- 4.31s 2026-04-07 00:59:11.266720 | orchestrator | Check if target 
directories exist --------------------------------------- 3.24s 2026-04-07 00:59:11.266727 | orchestrator | Create share directory -------------------------------------------------- 1.07s 2026-04-07 00:59:11.266734 | orchestrator | 2026-04-07 00:59:11.266741 | orchestrator | 2026-04-07 00:59:11.266748 | orchestrator | PLAY [Apply role cephclient] *************************************************** 2026-04-07 00:59:11.266755 | orchestrator | 2026-04-07 00:59:11.266763 | orchestrator | TASK [osism.services.cephclient : Include container tasks] ********************* 2026-04-07 00:59:11.266768 | orchestrator | Tuesday 07 April 2026 00:58:13 +0000 (0:00:00.300) 0:00:00.300 ********* 2026-04-07 00:59:11.266772 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/cephclient/tasks/container.yml for testbed-manager 2026-04-07 00:59:11.266783 | orchestrator | 2026-04-07 00:59:11.266788 | orchestrator | TASK [osism.services.cephclient : Create required directories] ***************** 2026-04-07 00:59:11.266792 | orchestrator | Tuesday 07 April 2026 00:58:13 +0000 (0:00:00.203) 0:00:00.504 ********* 2026-04-07 00:59:11.266800 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/configuration) 2026-04-07 00:59:11.266807 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/data) 2026-04-07 00:59:11.266819 | orchestrator | ok: [testbed-manager] => (item=/opt/cephclient) 2026-04-07 00:59:11.266827 | orchestrator | 2026-04-07 00:59:11.266833 | orchestrator | TASK [osism.services.cephclient : Copy configuration files] ******************** 2026-04-07 00:59:11.266840 | orchestrator | Tuesday 07 April 2026 00:58:15 +0000 (0:00:01.562) 0:00:02.067 ********* 2026-04-07 00:59:11.266848 | orchestrator | changed: [testbed-manager] => (item={'src': 'ceph.conf.j2', 'dest': '/opt/cephclient/configuration/ceph.conf'}) 2026-04-07 00:59:11.266855 | orchestrator | 2026-04-07 00:59:11.266862 | orchestrator | TASK 
[osism.services.cephclient : Copy keyring file] *************************** 2026-04-07 00:59:11.266868 | orchestrator | Tuesday 07 April 2026 00:58:16 +0000 (0:00:01.011) 0:00:03.078 ********* 2026-04-07 00:59:11.266970 | orchestrator | changed: [testbed-manager] 2026-04-07 00:59:11.266982 | orchestrator | 2026-04-07 00:59:11.266990 | orchestrator | TASK [osism.services.cephclient : Copy docker-compose.yml file] **************** 2026-04-07 00:59:11.266999 | orchestrator | Tuesday 07 April 2026 00:58:16 +0000 (0:00:00.783) 0:00:03.861 ********* 2026-04-07 00:59:11.267004 | orchestrator | changed: [testbed-manager] 2026-04-07 00:59:11.267009 | orchestrator | 2026-04-07 00:59:11.267014 | orchestrator | TASK [osism.services.cephclient : Manage cephclient service] ******************* 2026-04-07 00:59:11.267019 | orchestrator | Tuesday 07 April 2026 00:58:17 +0000 (0:00:00.791) 0:00:04.653 ********* 2026-04-07 00:59:11.267025 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage cephclient service (10 retries left). 
2026-04-07 00:59:11.267030 | orchestrator | ok: [testbed-manager] 2026-04-07 00:59:11.267035 | orchestrator | 2026-04-07 00:59:11.267042 | orchestrator | TASK [osism.services.cephclient : Copy wrapper scripts] ************************ 2026-04-07 00:59:11.267060 | orchestrator | Tuesday 07 April 2026 00:58:59 +0000 (0:00:41.411) 0:00:46.065 ********* 2026-04-07 00:59:11.267068 | orchestrator | changed: [testbed-manager] => (item=ceph) 2026-04-07 00:59:11.267076 | orchestrator | changed: [testbed-manager] => (item=ceph-authtool) 2026-04-07 00:59:11.267084 | orchestrator | changed: [testbed-manager] => (item=rados) 2026-04-07 00:59:11.267091 | orchestrator | changed: [testbed-manager] => (item=radosgw-admin) 2026-04-07 00:59:11.267099 | orchestrator | changed: [testbed-manager] => (item=rbd) 2026-04-07 00:59:11.267106 | orchestrator | 2026-04-07 00:59:11.267114 | orchestrator | TASK [osism.services.cephclient : Remove old wrapper scripts] ****************** 2026-04-07 00:59:11.267122 | orchestrator | Tuesday 07 April 2026 00:59:03 +0000 (0:00:04.057) 0:00:50.122 ********* 2026-04-07 00:59:11.267128 | orchestrator | ok: [testbed-manager] => (item=crushtool) 2026-04-07 00:59:11.267133 | orchestrator | 2026-04-07 00:59:11.267138 | orchestrator | TASK [osism.services.cephclient : Include package tasks] *********************** 2026-04-07 00:59:11.267143 | orchestrator | Tuesday 07 April 2026 00:59:03 +0000 (0:00:00.576) 0:00:50.699 ********* 2026-04-07 00:59:11.267148 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:59:11.267153 | orchestrator | 2026-04-07 00:59:11.267157 | orchestrator | TASK [osism.services.cephclient : Include rook task] *************************** 2026-04-07 00:59:11.267162 | orchestrator | Tuesday 07 April 2026 00:59:03 +0000 (0:00:00.123) 0:00:50.822 ********* 2026-04-07 00:59:11.267166 | orchestrator | skipping: [testbed-manager] 2026-04-07 00:59:11.267171 | orchestrator | 2026-04-07 00:59:11.267175 | orchestrator | RUNNING HANDLER 
[osism.services.cephclient : Restart cephclient service] ******* 2026-04-07 00:59:11.267180 | orchestrator | Tuesday 07 April 2026 00:59:04 +0000 (0:00:00.297) 0:00:51.119 ********* 2026-04-07 00:59:11.267184 | orchestrator | changed: [testbed-manager] 2026-04-07 00:59:11.267188 | orchestrator | 2026-04-07 00:59:11.267193 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Ensure that all containers are up] *** 2026-04-07 00:59:11.267231 | orchestrator | Tuesday 07 April 2026 00:59:05 +0000 (0:00:01.429) 0:00:52.549 ********* 2026-04-07 00:59:11.267240 | orchestrator | changed: [testbed-manager] 2026-04-07 00:59:11.267246 | orchestrator | 2026-04-07 00:59:11.267253 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Wait for an healthy service] ****** 2026-04-07 00:59:11.267259 | orchestrator | Tuesday 07 April 2026 00:59:06 +0000 (0:00:00.717) 0:00:53.267 ********* 2026-04-07 00:59:11.267266 | orchestrator | changed: [testbed-manager] 2026-04-07 00:59:11.267273 | orchestrator | 2026-04-07 00:59:11.267279 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Copy bash completion scripts] ***** 2026-04-07 00:59:11.267285 | orchestrator | Tuesday 07 April 2026 00:59:06 +0000 (0:00:00.563) 0:00:53.831 ********* 2026-04-07 00:59:11.267291 | orchestrator | ok: [testbed-manager] => (item=ceph) 2026-04-07 00:59:11.267298 | orchestrator | ok: [testbed-manager] => (item=rados) 2026-04-07 00:59:11.267305 | orchestrator | ok: [testbed-manager] => (item=radosgw-admin) 2026-04-07 00:59:11.267312 | orchestrator | ok: [testbed-manager] => (item=rbd) 2026-04-07 00:59:11.267319 | orchestrator | 2026-04-07 00:59:11.267327 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 00:59:11.267342 | orchestrator | testbed-manager : ok=12  changed=8  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-07 00:59:11.267349 | orchestrator | 2026-04-07 00:59:11.267354 | orchestrator | 2026-04-07 
00:59:11.267358 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 00:59:11.267362 | orchestrator | Tuesday 07 April 2026 00:59:08 +0000 (0:00:01.491) 0:00:55.323 ********* 2026-04-07 00:59:11.267367 | orchestrator | =============================================================================== 2026-04-07 00:59:11.267371 | orchestrator | osism.services.cephclient : Manage cephclient service ------------------ 41.41s 2026-04-07 00:59:11.267376 | orchestrator | osism.services.cephclient : Copy wrapper scripts ------------------------ 4.06s 2026-04-07 00:59:11.267380 | orchestrator | osism.services.cephclient : Create required directories ----------------- 1.56s 2026-04-07 00:59:11.267385 | orchestrator | osism.services.cephclient : Copy bash completion scripts ---------------- 1.49s 2026-04-07 00:59:11.267392 | orchestrator | osism.services.cephclient : Restart cephclient service ------------------ 1.43s 2026-04-07 00:59:11.267403 | orchestrator | osism.services.cephclient : Copy configuration files -------------------- 1.01s 2026-04-07 00:59:11.267410 | orchestrator | osism.services.cephclient : Copy docker-compose.yml file ---------------- 0.79s 2026-04-07 00:59:11.267417 | orchestrator | osism.services.cephclient : Copy keyring file --------------------------- 0.78s 2026-04-07 00:59:11.267423 | orchestrator | osism.services.cephclient : Ensure that all containers are up ----------- 0.72s 2026-04-07 00:59:11.267429 | orchestrator | osism.services.cephclient : Remove old wrapper scripts ------------------ 0.58s 2026-04-07 00:59:11.267436 | orchestrator | osism.services.cephclient : Wait for an healthy service ----------------- 0.56s 2026-04-07 00:59:11.267442 | orchestrator | osism.services.cephclient : Include rook task --------------------------- 0.30s 2026-04-07 00:59:11.267449 | orchestrator | osism.services.cephclient : Include container tasks --------------------- 0.20s 2026-04-07 00:59:11.267455 | 
orchestrator | osism.services.cephclient : Include package tasks ----------------------- 0.12s 2026-04-07 00:59:11.267462 | orchestrator | 2026-04-07 00:59:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 00:59:11.269553 | orchestrator | 2026-04-07 00:59:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:59:14.308877 | orchestrator | 2026-04-07 00:59:14 | INFO  | Task c5300f78-dbb7-4cb6-b629-8810f3863e9b is in state STARTED 2026-04-07 00:59:14.310841 | orchestrator | 2026-04-07 00:59:14 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED 2026-04-07 00:59:14.313966 | orchestrator | 2026-04-07 00:59:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 00:59:14.315721 | orchestrator | 2026-04-07 00:59:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 00:59:14.315785 | orchestrator | 2026-04-07 00:59:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:59:17.359892 | orchestrator | 2026-04-07 00:59:17 | INFO  | Task c5300f78-dbb7-4cb6-b629-8810f3863e9b is in state STARTED 2026-04-07 00:59:17.361714 | orchestrator | 2026-04-07 00:59:17 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED 2026-04-07 00:59:17.363609 | orchestrator | 2026-04-07 00:59:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 00:59:17.365161 | orchestrator | 2026-04-07 00:59:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 00:59:17.365268 | orchestrator | 2026-04-07 00:59:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 00:59:20.399816 | orchestrator | 2026-04-07 00:59:20 | INFO  | Task c5300f78-dbb7-4cb6-b629-8810f3863e9b is in state STARTED 2026-04-07 00:59:20.404091 | orchestrator | 2026-04-07 00:59:20 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED 2026-04-07 00:59:20.405426 | orchestrator | 2026-04-07 00:59:20 | INFO  
| Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 00:59:20.406251 | orchestrator | 2026-04-07 00:59:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 00:59:20.406348 | orchestrator | 2026-04-07 00:59:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:00:21.331641 | orchestrator | 2026-04-07 01:00:21 | INFO  | Task c5300f78-dbb7-4cb6-b629-8810f3863e9b is in state STARTED 2026-04-07 01:00:21.333446 | orchestrator | 2026-04-07 01:00:21 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED 2026-04-07 01:00:21.335610 | orchestrator | 2026-04-07 01:00:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:00:21.337795 | orchestrator | 2026-04-07 01:00:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:00:21.337840 | orchestrator | 2026-04-07 01:00:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:00:24.381798 | orchestrator | 2026-04-07 01:00:24 | INFO  | Task c5300f78-dbb7-4cb6-b629-8810f3863e9b is in state SUCCESS 2026-04-07 01:00:24.383431 | orchestrator | 2026-04-07 01:00:24.383473 | orchestrator | 2026-04-07 01:00:24.383479 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-07 01:00:24.383496 |
orchestrator | 2026-04-07 01:00:24.383500 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-07 01:00:24.383505 | orchestrator | Tuesday 07 April 2026 00:59:11 +0000 (0:00:00.283) 0:00:00.283 ********* 2026-04-07 01:00:24.383509 | orchestrator | ok: [testbed-manager] 2026-04-07 01:00:24.383513 | orchestrator | ok: [testbed-node-0] 2026-04-07 01:00:24.383517 | orchestrator | ok: [testbed-node-1] 2026-04-07 01:00:24.383521 | orchestrator | ok: [testbed-node-2] 2026-04-07 01:00:24.383525 | orchestrator | ok: [testbed-node-3] 2026-04-07 01:00:24.383529 | orchestrator | ok: [testbed-node-4] 2026-04-07 01:00:24.383533 | orchestrator | ok: [testbed-node-5] 2026-04-07 01:00:24.383537 | orchestrator | 2026-04-07 01:00:24.383541 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-07 01:00:24.383545 | orchestrator | Tuesday 07 April 2026 00:59:12 +0000 (0:00:00.741) 0:00:01.024 ********* 2026-04-07 01:00:24.383549 | orchestrator | ok: [testbed-manager] => (item=enable_prometheus_True) 2026-04-07 01:00:24.383553 | orchestrator | ok: [testbed-node-0] => (item=enable_prometheus_True) 2026-04-07 01:00:24.383557 | orchestrator | ok: [testbed-node-1] => (item=enable_prometheus_True) 2026-04-07 01:00:24.383561 | orchestrator | ok: [testbed-node-2] => (item=enable_prometheus_True) 2026-04-07 01:00:24.383565 | orchestrator | ok: [testbed-node-3] => (item=enable_prometheus_True) 2026-04-07 01:00:24.383569 | orchestrator | ok: [testbed-node-4] => (item=enable_prometheus_True) 2026-04-07 01:00:24.383573 | orchestrator | ok: [testbed-node-5] => (item=enable_prometheus_True) 2026-04-07 01:00:24.383576 | orchestrator | 2026-04-07 01:00:24.383580 | orchestrator | PLAY [Apply role prometheus] *************************************************** 2026-04-07 01:00:24.383584 | orchestrator | 2026-04-07 01:00:24.383588 | orchestrator | TASK [prometheus : include_tasks] 
********************************************** 2026-04-07 01:00:24.383592 | orchestrator | Tuesday 07 April 2026 00:59:13 +0000 (0:00:00.790) 0:00:01.815 ********* 2026-04-07 01:00:24.383603 | orchestrator | included: /ansible/roles/prometheus/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 01:00:24.383608 | orchestrator | 2026-04-07 01:00:24.383613 | orchestrator | TASK [prometheus : Ensuring config directories exist] ************************** 2026-04-07 01:00:24.383616 | orchestrator | Tuesday 07 April 2026 00:59:14 +0000 (0:00:01.129) 0:00:02.944 ********* 2026-04-07 01:00:24.383623 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-07 01:00:24.383629 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 
'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.383637 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.383650 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.383654 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.383658 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383764 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383771 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383776 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.383780 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383883 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:24.383890 | 
orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.383897 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383902 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383906 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383910 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383917 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.383921 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383928 | orchestrator | changed: [testbed-node-4] => (item={'key': 
'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383932 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383938 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383942 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383946 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383954 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383959 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383965 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383970 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383976 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.383980 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.383984 | orchestrator | 2026-04-07 01:00:24.383988 | orchestrator | TASK [prometheus : include_tasks] ********************************************** 2026-04-07 01:00:24.383992 | orchestrator | Tuesday 07 April 2026 00:59:17 +0000 (0:00:03.310) 0:00:06.255 ********* 2026-04-07 01:00:24.383996 | orchestrator | included: /ansible/roles/prometheus/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-07 01:00:24.384000 | orchestrator | 2026-04-07 01:00:24.384004 | orchestrator | TASK [service-cert-copy : prometheus | Copying over extra CA certificates] ***** 2026-04-07 01:00:24.384011 | orchestrator | Tuesday 07 April 2026 00:59:18 +0000 (0:00:01.284) 0:00:07.540 ********* 2026-04-07 01:00:24.384016 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-07 01:00:24.384020 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.384027 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.384031 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.384037 | orchestrator | 
changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.384041 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.384048 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.384052 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.384056 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.384062 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.384066 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.384070 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 
'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384079 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384084 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384091 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384095 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384099 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384106 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384110 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384114 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384120 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384128 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 01:00:24.384132 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384136 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384143 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384164 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384171 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384178 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384182 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384186 | orchestrator |
2026-04-07 01:00:24.384190 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS certificate] ***
2026-04-07 01:00:24.384195 | orchestrator | Tuesday 07 April 2026 00:59:25 +0000 (0:00:06.107) 0:00:13.647 *********
2026-04-07 01:00:24.384199 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-04-07 01:00:24.384206 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384210 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384216 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384300 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384306 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384311 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384315 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.384319 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384326 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384330 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384337 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 01:00:24.384345 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384349 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384354 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384357 | orchestrator | skipping: [testbed-manager]
2026-04-07 01:00:24.384362 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384366 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.384372 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384485 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384493 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384499 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384504 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384508 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384512 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384516 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.384521 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384531 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384543 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.384551 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384564 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384570 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384576 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.384582 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384589 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384595 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.384601 | orchestrator |
2026-04-07 01:00:24.384606 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS key] ***
2026-04-07 01:00:24.384612 | orchestrator | Tuesday 07 April 2026 00:59:27 +0000 (0:00:02.091) 0:00:15.738 *********
2026-04-07 01:00:24.384619 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384631 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-04-07 01:00:24.384649 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384655 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384662 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384668 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384673 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384684 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384694 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384701 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.384707 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384713 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.384719 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384726 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.384733 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.384739 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.384768 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 01:00:24.385035 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.385053 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.385058 | orchestrator | skipping: [testbed-manager]
2026-04-07 01:00:24.385062 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.385066 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw',
'/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.385071 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 01:00:24.385075 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 01:00:24.385089 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.385093 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.385097 | orchestrator | skipping: [testbed-node-2] 2026-04-07 01:00:24.385101 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.385108 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.385112 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': 
['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.385116 | orchestrator | skipping: [testbed-node-3] 2026-04-07 01:00:24.385120 | orchestrator | skipping: [testbed-node-4] 2026-04-07 01:00:24.385125 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 01:00:24.385129 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.385136 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.385140 | orchestrator | skipping: [testbed-node-5] 2026-04-07 01:00:24.385144 | orchestrator | 2026-04-07 01:00:24.385152 | orchestrator | TASK [prometheus : Copying over config.json files] ***************************** 2026-04-07 01:00:24.385156 | orchestrator | Tuesday 07 April 2026 00:59:29 +0000 (0:00:02.545) 0:00:18.284 ********* 2026-04-07 01:00:24.385160 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-07 01:00:24.385167 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.385172 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.385176 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.385180 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.385187 | orchestrator | changed: [testbed-manager] => (item={'key': 
'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.385194 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.385199 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-07 01:00:24.385204 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385213 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385219 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385311 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385322 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385327 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385335 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385339 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': 
['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385347 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385351 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385355 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385362 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 
'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:24.385369 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385374 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 
'dimensions': {}}}) 2026-04-07 01:00:24.385378 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385384 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385388 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385392 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 
'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-07 01:00:24.385400 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385405 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385412 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': 
['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-07 01:00:24.385416 | orchestrator | 2026-04-07 01:00:24.385420 | orchestrator | TASK [prometheus : Find custom prometheus alert rules files] ******************* 2026-04-07 01:00:24.385424 | orchestrator | Tuesday 07 April 2026 00:59:36 +0000 (0:00:06.605) 0:00:24.890 ********* 2026-04-07 01:00:24.385429 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-07 01:00:24.385432 | orchestrator | 2026-04-07 01:00:24.385436 | orchestrator | TASK [prometheus : Copying over custom prometheus alert rules files] *********** 2026-04-07 01:00:24.385440 | orchestrator | Tuesday 07 April 2026 00:59:37 +0000 (0:00:00.921) 0:00:25.811 ********* 2026-04-07 01:00:24.385444 | orchestrator | skipping: [testbed-manager] 2026-04-07 01:00:24.385448 | orchestrator | skipping: [testbed-node-0] 2026-04-07 01:00:24.385452 | orchestrator | skipping: [testbed-node-1] 2026-04-07 01:00:24.385456 | orchestrator | skipping: [testbed-node-2] 2026-04-07 01:00:24.385460 | orchestrator | skipping: [testbed-node-3] 2026-04-07 01:00:24.385463 | orchestrator | skipping: [testbed-node-4] 2026-04-07 01:00:24.385467 | orchestrator | skipping: [testbed-node-5] 2026-04-07 01:00:24.385471 | orchestrator | 2026-04-07 01:00:24.385568 | orchestrator | TASK [prometheus : Find prometheus common config overrides] ******************** 2026-04-07 01:00:24.385574 | orchestrator | Tuesday 07 April 2026 00:59:37 +0000 (0:00:00.781) 0:00:26.592 ********* 2026-04-07 01:00:24.385578 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-07 01:00:24.385582 | orchestrator | 2026-04-07 01:00:24.385586 | orchestrator | TASK [prometheus : Find prometheus host config overrides] ********************** 2026-04-07 01:00:24.385592 | orchestrator | Tuesday 07 April 2026 00:59:38 +0000 (0:00:00.807) 0:00:27.400 
********* 2026-04-07 01:00:24.385597 | orchestrator | [WARNING]: Skipped 2026-04-07 01:00:24.385601 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385605 | orchestrator | manager/prometheus.yml.d' path due to this access issue: 2026-04-07 01:00:24.385609 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385617 | orchestrator | manager/prometheus.yml.d' is not a directory 2026-04-07 01:00:24.385621 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-07 01:00:24.385624 | orchestrator | [WARNING]: Skipped 2026-04-07 01:00:24.385628 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385632 | orchestrator | node-0/prometheus.yml.d' path due to this access issue: 2026-04-07 01:00:24.385636 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385640 | orchestrator | node-0/prometheus.yml.d' is not a directory 2026-04-07 01:00:24.385644 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-07 01:00:24.385647 | orchestrator | [WARNING]: Skipped 2026-04-07 01:00:24.385651 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385655 | orchestrator | node-3/prometheus.yml.d' path due to this access issue: 2026-04-07 01:00:24.385659 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385663 | orchestrator | node-3/prometheus.yml.d' is not a directory 2026-04-07 01:00:24.385667 | orchestrator | ok: [testbed-node-3 -> localhost] 2026-04-07 01:00:24.385670 | orchestrator | [WARNING]: Skipped 2026-04-07 01:00:24.385674 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385678 | orchestrator | node-1/prometheus.yml.d' path due to this access issue: 2026-04-07 
01:00:24.385682 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385686 | orchestrator | node-1/prometheus.yml.d' is not a directory 2026-04-07 01:00:24.385690 | orchestrator | ok: [testbed-node-1 -> localhost] 2026-04-07 01:00:24.385694 | orchestrator | [WARNING]: Skipped 2026-04-07 01:00:24.385697 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385701 | orchestrator | node-4/prometheus.yml.d' path due to this access issue: 2026-04-07 01:00:24.385705 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385709 | orchestrator | node-4/prometheus.yml.d' is not a directory 2026-04-07 01:00:24.385713 | orchestrator | ok: [testbed-node-4 -> localhost] 2026-04-07 01:00:24.385716 | orchestrator | [WARNING]: Skipped 2026-04-07 01:00:24.385720 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385724 | orchestrator | node-2/prometheus.yml.d' path due to this access issue: 2026-04-07 01:00:24.385728 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385732 | orchestrator | node-2/prometheus.yml.d' is not a directory 2026-04-07 01:00:24.385756 | orchestrator | ok: [testbed-node-2 -> localhost] 2026-04-07 01:00:24.385761 | orchestrator | [WARNING]: Skipped 2026-04-07 01:00:24.385765 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385768 | orchestrator | node-5/prometheus.yml.d' path due to this access issue: 2026-04-07 01:00:24.385772 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-07 01:00:24.385777 | orchestrator | node-5/prometheus.yml.d' is not a directory 2026-04-07 01:00:24.385781 | orchestrator | ok: [testbed-node-5 -> localhost] 2026-04-07 01:00:24.385784 | 
orchestrator |
2026-04-07 01:00:24.385788 | orchestrator | TASK [prometheus : Copying over prometheus config file] ************************
2026-04-07 01:00:24.385803 | orchestrator | Tuesday 07 April 2026 00:59:40 +0000 (0:00:01.679) 0:00:29.080 *********
2026-04-07 01:00:24.385808 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)
2026-04-07 01:00:24.385812 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.385816 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)
2026-04-07 01:00:24.385821 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.385827 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)
2026-04-07 01:00:24.385831 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.385835 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)
2026-04-07 01:00:24.385839 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.385843 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)
2026-04-07 01:00:24.385847 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)
2026-04-07 01:00:24.385851 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.385855 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.385859 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)
2026-04-07 01:00:24.385863 | orchestrator |
2026-04-07 01:00:24.385866 | orchestrator | TASK [prometheus : Copying over prometheus web config file] ********************
2026-04-07 01:00:24.385870 | orchestrator | Tuesday 07 April 2026 00:59:54 +0000 (0:00:13.545) 0:00:42.625 *********
2026-04-07 01:00:24.385874 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)
2026-04-07 01:00:24.385878 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.385884 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)
2026-04-07 01:00:24.385888 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.385892 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)
2026-04-07 01:00:24.385896 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.385900 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)
2026-04-07 01:00:24.385904 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.385907 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)
2026-04-07 01:00:24.385911 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)
2026-04-07 01:00:24.385915 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.385919 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.385923 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)
2026-04-07 01:00:24.385927 | orchestrator |
2026-04-07 01:00:24.385931 | orchestrator | TASK [prometheus : Copying over prometheus alertmanager config file] ***********
2026-04-07 01:00:24.385935 | orchestrator | Tuesday 07 April 2026 00:59:57 +0000 (0:00:03.033) 0:00:45.659 *********
2026-04-07 01:00:24.385939 | orchestrator | skipping: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)
2026-04-07 01:00:24.385943 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.385947 | orchestrator | skipping: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)
2026-04-07 01:00:24.385950 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.385954 | orchestrator | skipping: [testbed-node-3] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)
2026-04-07 01:00:24.385959 | orchestrator | skipping: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)
2026-04-07 01:00:24.385962 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.385966 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.385970 | orchestrator | skipping: [testbed-node-4] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)
2026-04-07 01:00:24.385974 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.385978 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)
2026-04-07 01:00:24.385985 | orchestrator | skipping: [testbed-node-5] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)
2026-04-07 01:00:24.385989 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.385993 | orchestrator |
2026-04-07 01:00:24.385997 | orchestrator | TASK [prometheus : Find custom Alertmanager alert notification templates] ******
2026-04-07 01:00:24.386000 | orchestrator | Tuesday 07 April 2026 00:59:58 +0000 (0:00:01.440) 0:00:47.100 *********
2026-04-07 01:00:24.386004 | orchestrator | ok: [testbed-manager -> localhost]
2026-04-07 01:00:24.386008 | orchestrator |
2026-04-07 01:00:24.386047 | orchestrator | TASK [prometheus : Copying over custom Alertmanager alert notification templates] ***
2026-04-07 01:00:24.386053 | orchestrator | Tuesday 07 April 2026 00:59:59 +0000 (0:00:00.672) 0:00:47.773 *********
2026-04-07 01:00:24.386057 | orchestrator | skipping: [testbed-manager]
2026-04-07 01:00:24.386061 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.386065 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.386069 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.386073 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.386076 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.386083 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.386087 | orchestrator |
2026-04-07 01:00:24.386091 | orchestrator | TASK [prometheus : Copying over my.cnf for mysqld_exporter] ********************
2026-04-07 01:00:24.386095 | orchestrator | Tuesday 07 April 2026 00:59:59 +0000 (0:00:00.659) 0:00:48.433 *********
2026-04-07 01:00:24.386099 | orchestrator | skipping: [testbed-manager]
2026-04-07 01:00:24.386103 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.386107 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.386110 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.386114 | orchestrator | changed: [testbed-node-0]
2026-04-07 01:00:24.386118 | orchestrator | changed: [testbed-node-1]
2026-04-07 01:00:24.386122 | orchestrator | changed: [testbed-node-2]
2026-04-07 01:00:24.386126 | orchestrator |
2026-04-07 01:00:24.386130 | orchestrator | TASK [prometheus : Copying cloud config file for openstack exporter] ***********
2026-04-07 01:00:24.386134 | orchestrator | Tuesday 07 April 2026 01:00:01 +0000 (0:00:01.780) 0:00:50.214 *********
2026-04-07 01:00:24.386137 | orchestrator | skipping: [testbed-manager] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)
2026-04-07 01:00:24.386141 | orchestrator | skipping: [testbed-manager]
2026-04-07 01:00:24.386145 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)
2026-04-07 01:00:24.386149 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.386153 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)
2026-04-07 01:00:24.386157 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.386161 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)
2026-04-07 01:00:24.386164 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.386168 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)
2026-04-07 01:00:24.386172 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.386178 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)
2026-04-07 01:00:24.386182 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.386186 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)
2026-04-07 01:00:24.386190 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.386194 | orchestrator |
2026-04-07 01:00:24.386198 | orchestrator | TASK [prometheus : Copying config file for blackbox exporter] ******************
2026-04-07 01:00:24.386201 | orchestrator | Tuesday 07 April 2026 01:00:02 +0000 (0:00:01.167) 0:00:51.381 *********
2026-04-07 01:00:24.386205 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)
2026-04-07 01:00:24.386212 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.386216 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)
2026-04-07 01:00:24.386220 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.386251 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)
2026-04-07 01:00:24.386258 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.386264 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)
2026-04-07 01:00:24.386268 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.386272 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)
2026-04-07 01:00:24.386276 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.386280 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)
2026-04-07 01:00:24.386284 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)
2026-04-07 01:00:24.386287 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.386291 | orchestrator |
2026-04-07 01:00:24.386295 | orchestrator | TASK [prometheus : Find extra prometheus server config files] ******************
2026-04-07 01:00:24.386299 | orchestrator | Tuesday 07 April 2026 01:00:04 +0000 (0:00:01.592) 0:00:52.973 *********
2026-04-07 01:00:24.386303 | orchestrator | [WARNING]: Skipped '/opt/configuration/environments/kolla/files/overlays/prometheus/extras/' path due to this access issue: '/opt/configuration/environments/kolla/files/overlays/prometheus/extras/' is not a directory
2026-04-07 01:00:24.386322 | orchestrator | ok: [testbed-manager -> localhost]
2026-04-07 01:00:24.386326 | orchestrator |
2026-04-07 01:00:24.386330 | orchestrator | TASK [prometheus : Create subdirectories for extra config files] ***************
2026-04-07 01:00:24.386334 | orchestrator | Tuesday 07 April 2026 01:00:05 +0000 (0:00:01.135) 0:00:54.109 *********
2026-04-07 01:00:24.386338 | orchestrator | skipping: [testbed-manager]
2026-04-07 01:00:24.386341 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.386345 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.386349 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.386353 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.386357 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.386361 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.386364 | orchestrator |
2026-04-07 01:00:24.386368 | orchestrator | TASK [prometheus : Template extra prometheus server config files] **************
2026-04-07 01:00:24.386372 | orchestrator | Tuesday 07 April 2026 01:00:06 +0000 (0:00:00.650) 0:00:54.759 *********
2026-04-07 01:00:24.386376 | orchestrator | skipping: [testbed-manager]
2026-04-07 01:00:24.386380 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:24.386384 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:24.386388 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:24.386391 | orchestrator | skipping: [testbed-node-3]
2026-04-07 01:00:24.386398 | orchestrator | skipping: [testbed-node-4]
2026-04-07 01:00:24.386402 | orchestrator | skipping: [testbed-node-5]
2026-04-07 01:00:24.386406 | orchestrator |
2026-04-07 01:00:24.386410 | orchestrator | TASK [service-check-containers : prometheus | Check containers] ****************
2026-04-07 01:00:24.386414 | orchestrator | Tuesday 07 April 2026 01:00:06 +0000 (0:00:00.791) 0:00:55.551 *********
2026-04-07 01:00:24.386418 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386428 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-04-07 01:00:24.386433 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386437 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386441 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386445 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386452 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386459 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386465 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386469 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386473 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386477 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386481 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386488 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386492 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386499 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386505 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386509 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386513 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386517 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386521 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386528 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 01:00:24.386537 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386545 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386549 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386553 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386557 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386561 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386571 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386575 | orchestrator |
2026-04-07 01:00:24.386579 | orchestrator | TASK [service-check-containers : prometheus | Notify handlers to restart containers] ***
2026-04-07 01:00:24.386583 | orchestrator | Tuesday 07 April 2026 01:00:10 +0000 (0:00:03.841) 0:00:59.392 *********
2026-04-07 01:00:24.386587 | orchestrator | changed: [testbed-manager] => {
2026-04-07 01:00:24.386591 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 01:00:24.386595 | orchestrator | }
2026-04-07 01:00:24.386599 | orchestrator | changed: [testbed-node-0] => {
2026-04-07 01:00:24.386603 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 01:00:24.386607 | orchestrator | }
2026-04-07 01:00:24.386611 | orchestrator | changed: [testbed-node-1] => {
2026-04-07 01:00:24.386614 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 01:00:24.386618 | orchestrator | }
2026-04-07 01:00:24.386622 | orchestrator | changed: [testbed-node-2] => {
2026-04-07 01:00:24.386626 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 01:00:24.386630 | orchestrator | }
2026-04-07 01:00:24.386634 | orchestrator | changed: [testbed-node-3] => {
2026-04-07 01:00:24.386637 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 01:00:24.386641 | orchestrator | }
2026-04-07 01:00:24.386645 | orchestrator | changed: [testbed-node-4] => {
2026-04-07 01:00:24.386649 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 01:00:24.386653 | orchestrator | }
2026-04-07 01:00:24.386657 | orchestrator | changed: [testbed-node-5] => {
2026-04-07 01:00:24.386660 | orchestrator |  "msg": "Notifying handlers"
2026-04-07 01:00:24.386664 | orchestrator | }
2026-04-07 01:00:24.386668 | orchestrator |
2026-04-07 01:00:24.386674 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-04-07 01:00:24.386678 | orchestrator | Tuesday 07 April 2026 01:00:11 +0000 (0:00:00.842) 0:01:00.235 *********
2026-04-07 01:00:24.386682 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-04-07 01:00:24.386687 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386694 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-07 01:00:24.386701 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 01:00:24.386705 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-07 01:00:24.386712 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-07 01:00:24.386716 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386720 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386724 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386731 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386738 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 01:00:24.386742 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386746 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386752 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': 
['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386757 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386761 | orchestrator | skipping: [testbed-manager] 2026-04-07 01:00:24.386765 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 01:00:24.386771 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386775 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386782 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386786 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-07 01:00:24.386790 | orchestrator | skipping: [testbed-node-0] 2026-04-07 01:00:24.386794 | orchestrator | skipping: [testbed-node-1] 2026-04-07 01:00:24.386798 | orchestrator | skipping: [testbed-node-2] 
2026-04-07 01:00:24.386804 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 01:00:24.386808 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386812 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386819 | orchestrator | skipping: [testbed-node-3] 2026-04-07 01:00:24.386823 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 01:00:24.386827 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386833 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386838 | orchestrator | skipping: [testbed-node-4] 2026-04-07 01:00:24.386842 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-07 01:00:24.386848 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386852 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-07 01:00:24.386856 | orchestrator | skipping: [testbed-node-5] 2026-04-07 01:00:24.386862 | orchestrator | 2026-04-07 01:00:24.386866 | orchestrator | TASK [prometheus : Creating prometheus database user and setting permissions] *** 2026-04-07 01:00:24.386870 | orchestrator | Tuesday 07 April 2026 01:00:13 +0000 (0:00:01.907) 0:01:02.143 ********* 2026-04-07 01:00:24.386874 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2026-04-07 01:00:24.386878 | orchestrator | skipping: [testbed-manager] 2026-04-07 01:00:24.386882 | orchestrator | 2026-04-07 01:00:24.386886 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 
2026-04-07 01:00:24.386890 | orchestrator | Tuesday 07 April 2026 01:00:14 +0000 (0:00:01.139) 0:01:03.282 ********* 2026-04-07 01:00:24.386894 | orchestrator | 2026-04-07 01:00:24.386898 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-07 01:00:24.386901 | orchestrator | Tuesday 07 April 2026 01:00:14 +0000 (0:00:00.076) 0:01:03.359 ********* 2026-04-07 01:00:24.386905 | orchestrator | 2026-04-07 01:00:24.386909 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-07 01:00:24.386913 | orchestrator | Tuesday 07 April 2026 01:00:14 +0000 (0:00:00.229) 0:01:03.589 ********* 2026-04-07 01:00:24.386917 | orchestrator | 2026-04-07 01:00:24.386921 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-07 01:00:24.386925 | orchestrator | Tuesday 07 April 2026 01:00:15 +0000 (0:00:00.063) 0:01:03.652 ********* 2026-04-07 01:00:24.386929 | orchestrator | 2026-04-07 01:00:24.386933 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-07 01:00:24.386936 | orchestrator | Tuesday 07 April 2026 01:00:15 +0000 (0:00:00.063) 0:01:03.716 ********* 2026-04-07 01:00:24.386940 | orchestrator | 2026-04-07 01:00:24.386944 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-07 01:00:24.386948 | orchestrator | Tuesday 07 April 2026 01:00:15 +0000 (0:00:00.058) 0:01:03.774 ********* 2026-04-07 01:00:24.386952 | orchestrator | 2026-04-07 01:00:24.386956 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-07 01:00:24.386959 | orchestrator | Tuesday 07 April 2026 01:00:15 +0000 (0:00:00.063) 0:01:03.838 ********* 2026-04-07 01:00:24.386963 | orchestrator | 2026-04-07 01:00:24.386967 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-server container] 
************* 2026-04-07 01:00:24.386971 | orchestrator | Tuesday 07 April 2026 01:00:15 +0000 (0:00:00.087) 0:01:03.925 ********* 2026-04-07 01:00:24.386981 | orchestrator | fatal: [testbed-manager]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=3.2.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_ya3i2_hc/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_ya3i2_hc/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_ya3i2_hc/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_ya3i2_hc/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise 
create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=3.2.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-server: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 01:00:24.386988 | orchestrator | 2026-04-07 01:00:24.386992 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-node-exporter container] ****** 2026-04-07 01:00:24.386996 | orchestrator | Tuesday 07 April 2026 01:00:17 +0000 (0:00:02.323) 0:01:06.249 ********* 2026-04-07 01:00:24.387003 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_usw11i8q/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_usw11i8q/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_usw11i8q/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_usw11i8q/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 01:00:24.387010 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_wkc0ceq9/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_wkc0ceq9/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_wkc0ceq9/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_wkc0ceq9/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n 
raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 01:00:24.387022 | orchestrator | fatal: [testbed-node-5]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_5wtn58z4/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_5wtn58z4/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_5wtn58z4/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_5wtn58z4/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File 
\"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 01:00:24.387032 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_anav3c2w/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_anav3c2w/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_anav3c2w/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_anav3c2w/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 01:00:24.387042 | orchestrator | fatal: [testbed-node-3]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_jiv6debl/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_jiv6debl/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_jiv6debl/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_jiv6debl/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n 
raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 01:00:24.387052 | orchestrator | fatal: [testbed-node-4]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_hh7852nb/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_hh7852nb/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_hh7852nb/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_hh7852nb/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File 
\"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 400 Client Error for http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F%2Fprometheus-node-exporter: Bad Request (\"invalid reference format\")\\n'"} 2026-04-07 01:00:24.387056 | orchestrator | 2026-04-07 01:00:24.387060 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 01:00:24.387064 | orchestrator | testbed-manager : ok=18  changed=9  unreachable=0 failed=1  skipped=10  rescued=0 ignored=0 2026-04-07 01:00:24.387068 | orchestrator | testbed-node-0 : ok=11  changed=6  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0 2026-04-07 01:00:24.387075 | orchestrator | testbed-node-1 : ok=11  changed=6  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0 2026-04-07 01:00:24.387079 | orchestrator | testbed-node-2 : ok=11  changed=6  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0 2026-04-07 01:00:24.387085 | orchestrator | testbed-node-3 : ok=10  changed=5  unreachable=0 failed=1  skipped=13  rescued=0 ignored=0 2026-04-07 01:00:24.387088 | orchestrator | testbed-node-4 : ok=10  changed=5  unreachable=0 failed=1  skipped=13  rescued=0 ignored=0 2026-04-07 01:00:24.387092 | orchestrator | testbed-node-5 : ok=10  changed=5  unreachable=0 failed=1  skipped=13  rescued=0 ignored=0 2026-04-07 01:00:24.387096 | orchestrator | 2026-04-07 01:00:24.387100 | orchestrator | 2026-04-07 01:00:24.387104 | orchestrator | TASKS RECAP 
********************************************************************
2026-04-07 01:00:24.387108 | orchestrator | Tuesday 07 April 2026 01:00:21 +0000 (0:00:03.875) 0:01:10.124 *********
2026-04-07 01:00:24.387112 | orchestrator | ===============================================================================
2026-04-07 01:00:24.387116 | orchestrator | prometheus : Copying over prometheus config file ----------------------- 13.55s
2026-04-07 01:00:24.387120 | orchestrator | prometheus : Copying over config.json files ----------------------------- 6.61s
2026-04-07 01:00:24.387123 | orchestrator | service-cert-copy : prometheus | Copying over extra CA certificates ----- 6.11s
2026-04-07 01:00:24.387127 | orchestrator | prometheus : Restart prometheus-node-exporter container ----------------- 3.88s
2026-04-07 01:00:24.387131 | orchestrator | service-check-containers : prometheus | Check containers ---------------- 3.84s
2026-04-07 01:00:24.387135 | orchestrator | prometheus : Ensuring config directories exist -------------------------- 3.31s
2026-04-07 01:00:24.387139 | orchestrator | prometheus : Copying over prometheus web config file -------------------- 3.03s
2026-04-07 01:00:24.387142 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS key --- 2.55s
2026-04-07 01:00:24.387146 | orchestrator | prometheus : Restart prometheus-server container ------------------------ 2.32s
2026-04-07 01:00:24.387150 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS certificate --- 2.09s
2026-04-07 01:00:24.387154 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.91s
2026-04-07 01:00:24.387158 | orchestrator | prometheus : Copying over my.cnf for mysqld_exporter -------------------- 1.78s
2026-04-07 01:00:24.387162 | orchestrator | prometheus : Find prometheus host config overrides ---------------------- 1.68s
2026-04-07 01:00:24.387166 | orchestrator | prometheus : Copying config file for blackbox exporter ------------------ 1.59s
2026-04-07 01:00:24.387169 | orchestrator | prometheus : Copying over prometheus alertmanager config file ----------- 1.44s
2026-04-07 01:00:24.387173 | orchestrator | prometheus : include_tasks ---------------------------------------------- 1.28s
2026-04-07 01:00:24.387177 | orchestrator | prometheus : Copying cloud config file for openstack exporter ----------- 1.17s
2026-04-07 01:00:24.387181 | orchestrator | prometheus : Creating prometheus database user and setting permissions --- 1.14s
2026-04-07 01:00:24.387185 | orchestrator | prometheus : Find extra prometheus server config files ------------------ 1.14s
2026-04-07 01:00:24.387188 | orchestrator | prometheus : include_tasks ---------------------------------------------- 1.13s
2026-04-07 01:00:24.387192 | orchestrator | 2026-04-07 01:00:24 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED
2026-04-07 01:00:24.388431 | orchestrator | 2026-04-07 01:00:24 | INFO  | Task 78717500-da24-469a-9007-123b4112bb04 is in state STARTED
2026-04-07 01:00:24.391783 | orchestrator | 2026-04-07 01:00:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:00:24.393880 | orchestrator | 2026-04-07 01:00:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:00:24.394002 | orchestrator | 2026-04-07 01:00:24 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:00:27.438996 | orchestrator | 2026-04-07 01:00:27 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED
2026-04-07 01:00:27.443409 | orchestrator | 2026-04-07 01:00:27 | INFO  | Task 78717500-da24-469a-9007-123b4112bb04 is in state STARTED
2026-04-07 01:00:27.445715 | orchestrator | 2026-04-07 01:00:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:00:27.447951 | orchestrator | 2026-04-07 01:00:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:00:27.448015 | orchestrator | 2026-04-07 01:00:27 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:00:30.485833 | orchestrator | 2026-04-07 01:00:30 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED
2026-04-07 01:00:30.487611 | orchestrator | 2026-04-07 01:00:30 | INFO  | Task 78717500-da24-469a-9007-123b4112bb04 is in state STARTED
2026-04-07 01:00:30.488196 | orchestrator | 2026-04-07 01:00:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:00:30.489161 | orchestrator | 2026-04-07 01:00:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:00:30.489335 | orchestrator | 2026-04-07 01:00:30 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:00:33.530580 | orchestrator | 2026-04-07 01:00:33 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED
2026-04-07 01:00:33.531363 | orchestrator | 2026-04-07 01:00:33 | INFO  | Task 78717500-da24-469a-9007-123b4112bb04 is in state STARTED
2026-04-07 01:00:33.532719 | orchestrator | 2026-04-07 01:00:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:00:33.533814 | orchestrator | 2026-04-07 01:00:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:00:33.533850 | orchestrator | 2026-04-07 01:00:33 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:00:36.577132 | orchestrator | 2026-04-07 01:00:36 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED
2026-04-07 01:00:36.578984 | orchestrator | 2026-04-07 01:00:36 | INFO  | Task 78717500-da24-469a-9007-123b4112bb04 is in state STARTED
2026-04-07 01:00:36.580432 | orchestrator | 2026-04-07 01:00:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:00:36.581846 | orchestrator | 2026-04-07 01:00:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:00:36.581982 | orchestrator | 2026-04-07 01:00:36 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:00:39.624801 | orchestrator | 2026-04-07 01:00:39 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED
2026-04-07 01:00:39.625567 | orchestrator | 2026-04-07 01:00:39 | INFO  | Task 78717500-da24-469a-9007-123b4112bb04 is in state STARTED
2026-04-07 01:00:39.626327 | orchestrator | 2026-04-07 01:00:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:00:39.627561 | orchestrator | 2026-04-07 01:00:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:00:39.629580 | orchestrator | 2026-04-07 01:00:39 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:00:42.676976 | orchestrator | 2026-04-07 01:00:42 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED
2026-04-07 01:00:42.678454 | orchestrator | 2026-04-07 01:00:42 | INFO  | Task 78717500-da24-469a-9007-123b4112bb04 is in state SUCCESS
2026-04-07 01:00:42.679904 | orchestrator |
2026-04-07 01:00:42.679940 | orchestrator |
2026-04-07 01:00:42.679945 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-07 01:00:42.679951 | orchestrator |
2026-04-07 01:00:42.679955 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-07 01:00:42.679959 | orchestrator | Tuesday 07 April 2026 01:00:24 +0000 (0:00:00.268) 0:00:00.268 *********
2026-04-07 01:00:42.679963 | orchestrator | ok: [testbed-node-0]
2026-04-07 01:00:42.679968 | orchestrator | ok: [testbed-node-1]
2026-04-07 01:00:42.679972 | orchestrator | ok: [testbed-node-2]
2026-04-07 01:00:42.679976 | orchestrator |
2026-04-07 01:00:42.679980 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-07 01:00:42.679985 | orchestrator | Tuesday 07 April 2026 01:00:24 +0000 (0:00:00.254) 0:00:00.522
********* 2026-04-07 01:00:42.679989 | orchestrator | ok: [testbed-node-0] => (item=enable_grafana_True) 2026-04-07 01:00:42.679994 | orchestrator | ok: [testbed-node-1] => (item=enable_grafana_True) 2026-04-07 01:00:42.679998 | orchestrator | ok: [testbed-node-2] => (item=enable_grafana_True) 2026-04-07 01:00:42.680001 | orchestrator | 2026-04-07 01:00:42.680005 | orchestrator | PLAY [Apply role grafana] ****************************************************** 2026-04-07 01:00:42.680009 | orchestrator | 2026-04-07 01:00:42.680013 | orchestrator | TASK [grafana : include_tasks] ************************************************* 2026-04-07 01:00:42.680017 | orchestrator | Tuesday 07 April 2026 01:00:25 +0000 (0:00:00.251) 0:00:00.774 ********* 2026-04-07 01:00:42.680021 | orchestrator | included: /ansible/roles/grafana/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 01:00:42.680026 | orchestrator | 2026-04-07 01:00:42.680030 | orchestrator | TASK [grafana : Ensuring config directories exist] ***************************** 2026-04-07 01:00:42.680034 | orchestrator | Tuesday 07 April 2026 01:00:25 +0000 (0:00:00.507) 0:00:01.281 ********* 2026-04-07 01:00:42.680040 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 
01:00:42.680060 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680065 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680081 | orchestrator | 2026-04-07 01:00:42.680086 | orchestrator | TASK [grafana : Check if extra configuration file exists] ********************** 2026-04-07 01:00:42.680089 | orchestrator | Tuesday 07 April 2026 01:00:26 +0000 (0:00:01.029) 0:00:02.311 ********* 2026-04-07 01:00:42.680093 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-07 01:00:42.680099 | 
orchestrator | 2026-04-07 01:00:42.680102 | orchestrator | TASK [grafana : include_tasks] ************************************************* 2026-04-07 01:00:42.680106 | orchestrator | Tuesday 07 April 2026 01:00:27 +0000 (0:00:00.754) 0:00:03.065 ********* 2026-04-07 01:00:42.680110 | orchestrator | included: /ansible/roles/grafana/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-07 01:00:42.680114 | orchestrator | 2026-04-07 01:00:42.680118 | orchestrator | TASK [service-cert-copy : grafana | Copying over extra CA certificates] ******** 2026-04-07 01:00:42.680129 | orchestrator | Tuesday 07 April 2026 01:00:27 +0000 (0:00:00.417) 0:00:03.483 ********* 2026-04-07 01:00:42.680133 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680137 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 
'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680144 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680148 | orchestrator | 2026-04-07 01:00:42.680152 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS certificate] *** 2026-04-07 01:00:42.680156 | orchestrator | Tuesday 07 April 2026 01:00:29 +0000 (0:00:01.449) 0:00:04.933 ********* 2026-04-07 01:00:42.680160 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 
'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 01:00:42.680168 | orchestrator | skipping: [testbed-node-0] 2026-04-07 01:00:42.680175 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 01:00:42.680179 | orchestrator | skipping: [testbed-node-1] 2026-04-07 01:00:42.680183 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 01:00:42.680187 | 
orchestrator | skipping: [testbed-node-2] 2026-04-07 01:00:42.680191 | orchestrator | 2026-04-07 01:00:42.680195 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS key] ***** 2026-04-07 01:00:42.680199 | orchestrator | Tuesday 07 April 2026 01:00:29 +0000 (0:00:00.441) 0:00:05.374 ********* 2026-04-07 01:00:42.680203 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 01:00:42.680207 | orchestrator | skipping: [testbed-node-0] 2026-04-07 01:00:42.680214 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option 
httpchk']}}}})  2026-04-07 01:00:42.680222 | orchestrator | skipping: [testbed-node-1] 2026-04-07 01:00:42.680226 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 01:00:42.680230 | orchestrator | skipping: [testbed-node-2] 2026-04-07 01:00:42.680233 | orchestrator | 2026-04-07 01:00:42.680309 | orchestrator | TASK [grafana : Copying over config.json files] ******************************** 2026-04-07 01:00:42.680314 | orchestrator | Tuesday 07 April 2026 01:00:30 +0000 (0:00:00.607) 0:00:05.982 ********* 2026-04-07 01:00:42.680324 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': 
'3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680329 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680334 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680338 | orchestrator | 2026-04-07 01:00:42.680342 | orchestrator | TASK [grafana : Copying over grafana.ini] ************************************** 2026-04-07 01:00:42.680346 | orchestrator | Tuesday 07 April 2026 01:00:31 +0000 (0:00:01.281) 0:00:07.263 ********* 2026-04-07 01:00:42.680359 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680376 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680388 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': 
{'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-04-07 01:00:42.680401 | orchestrator |
2026-04-07 01:00:42.680407 | orchestrator | TASK [grafana : Copying over extra configuration file] *************************
2026-04-07 01:00:42.680413 | orchestrator | Tuesday 07 April 2026 01:00:32 +0000 (0:00:01.401) 0:00:08.664 *********
2026-04-07 01:00:42.680419 | orchestrator | skipping: [testbed-node-0]
2026-04-07 01:00:42.680424 | orchestrator | skipping: [testbed-node-1]
2026-04-07 01:00:42.680430 | orchestrator | skipping: [testbed-node-2]
2026-04-07 01:00:42.680436 | orchestrator |
2026-04-07 01:00:42.680442 | orchestrator | TASK [grafana : Configuring Prometheus as data source for Grafana] *************
2026-04-07 01:00:42.680448 | orchestrator | Tuesday 07 April 2026 01:00:33 +0000 (0:00:00.277) 0:00:08.942 *********
2026-04-07 01:00:42.680525 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-04-07 01:00:42.680533 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-04-07 01:00:42.680537 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-04-07 01:00:42.680542 | orchestrator |
2026-04-07 01:00:42.680546 | orchestrator | TASK [grafana : Configuring dashboards provisioning] ***************************
2026-04-07 01:00:42.680551 | orchestrator | Tuesday 07 April 2026 01:00:34 +0000 (0:00:01.118) 0:00:10.061 *********
2026-04-07 01:00:42.680555 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-04-07 01:00:42.680560 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-04-07 01:00:42.680571 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-04-07 01:00:42.680576 | orchestrator |
2026-04-07 01:00:42.680581 | orchestrator | TASK [grafana : Check if the folder for custom grafana dashboards exists] ******
2026-04-07 01:00:42.680585 | orchestrator | Tuesday 07 April 2026 01:00:35 +0000 (0:00:01.175) 0:00:11.236 *********
2026-04-07 01:00:42.680590 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-04-07 01:00:42.680594 | orchestrator |
2026-04-07 01:00:42.680599 | orchestrator | TASK [grafana : Remove templated Grafana dashboards] ***************************
2026-04-07 01:00:42.680604 | orchestrator | Tuesday 07 April 2026 01:00:36 +0000 (0:00:00.804) 0:00:11.906 *********
2026-04-07 01:00:42.680609 | orchestrator | ok: [testbed-node-0]
2026-04-07 01:00:42.680613 | orchestrator | ok: [testbed-node-1]
2026-04-07 01:00:42.680618 | orchestrator | ok: [testbed-node-2]
2026-04-07 01:00:42.680622 | orchestrator |
2026-04-07 01:00:42.680627 | orchestrator | TASK [grafana : Copying over custom dashboards] ********************************
2026-04-07 01:00:42.680632 | orchestrator | Tuesday 07 April 2026 01:00:37 +0000 (0:00:01.192) 0:00:12.710 *********
2026-04-07 01:00:42.680637 | orchestrator | changed: [testbed-node-0]
2026-04-07 01:00:42.680645 | orchestrator | changed: [testbed-node-1]
2026-04-07 01:00:42.680649 | orchestrator | changed: [testbed-node-2]
2026-04-07 01:00:42.680654 | orchestrator |
2026-04-07 01:00:42.680658 | orchestrator | TASK [service-check-containers : grafana | Check containers] *******************
2026-04-07 01:00:42.680663 | orchestrator | Tuesday 07 April 2026 01:00:38 +0000 (0:00:01.192) 0:00:13.903 *********
2026-04-07 01:00:42.680668 | orchestrator | changed:
[testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680673 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680684 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': 
{'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-07 01:00:42.680690 | orchestrator | 2026-04-07 01:00:42.680696 | orchestrator | TASK [service-check-containers : grafana | Notify handlers to restart containers] *** 2026-04-07 01:00:42.680706 | orchestrator | Tuesday 07 April 2026 01:00:39 +0000 (0:00:00.967) 0:00:14.870 ********* 2026-04-07 01:00:42.680712 | orchestrator | changed: [testbed-node-0] => { 2026-04-07 01:00:42.680718 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 01:00:42.680724 | orchestrator | } 2026-04-07 01:00:42.680730 | orchestrator | changed: [testbed-node-1] => { 2026-04-07 01:00:42.680736 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 01:00:42.680742 | orchestrator | } 2026-04-07 01:00:42.680748 | orchestrator | changed: [testbed-node-2] => { 2026-04-07 01:00:42.680753 | orchestrator |  "msg": "Notifying handlers" 2026-04-07 01:00:42.680762 | orchestrator | } 2026-04-07 01:00:42.680768 | orchestrator | 2026-04-07 01:00:42.680774 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-07 01:00:42.680780 | orchestrator | Tuesday 07 April 2026 01:00:39 +0000 (0:00:00.282) 0:00:15.153 ********* 2026-04-07 01:00:42.680787 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 
'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 01:00:42.680793 | orchestrator | skipping: [testbed-node-0] 2026-04-07 01:00:42.680804 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 01:00:42.680811 | orchestrator | skipping: [testbed-node-1] 2026-04-07 01:00:42.680818 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release//grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 
'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-07 01:00:42.680824 | orchestrator | skipping: [testbed-node-2] 2026-04-07 01:00:42.680830 | orchestrator | 2026-04-07 01:00:42.680837 | orchestrator | TASK [grafana : Creating grafana database] ************************************* 2026-04-07 01:00:42.680843 | orchestrator | Tuesday 07 April 2026 01:00:40 +0000 (0:00:00.673) 0:00:15.826 ********* 2026-04-07 01:00:42.680849 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-07 01:00:42.680856 | orchestrator | 2026-04-07 01:00:42.680863 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-07 01:00:42.680881 | orchestrator | testbed-node-0 : ok=16  changed=9  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-07 01:00:42.680888 | orchestrator | testbed-node-1 : ok=14  changed=9  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2026-04-07 01:00:42.680895 | orchestrator | testbed-node-2 : ok=14  changed=9  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2026-04-07 01:00:42.680901 | orchestrator | 2026-04-07 01:00:42.680907 | orchestrator | 2026-04-07 01:00:42.680914 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-07 01:00:42.680918 | orchestrator | Tuesday 07 April 2026 01:00:40 +0000 (0:00:00.844) 0:00:16.671 ********* 2026-04-07 01:00:42.680922 | orchestrator | =============================================================================== 2026-04-07 01:00:42.680926 | orchestrator | service-cert-copy : grafana | Copying over extra CA certificates -------- 1.45s 2026-04-07 01:00:42.680930 | orchestrator | grafana : Copying over grafana.ini -------------------------------------- 1.40s 2026-04-07 01:00:42.680934 | orchestrator | grafana : Copying over config.json files -------------------------------- 1.28s 
2026-04-07 01:00:42.680938 | orchestrator | grafana : Copying over custom dashboards -------------------------------- 1.19s 2026-04-07 01:00:42.680942 | orchestrator | grafana : Configuring dashboards provisioning --------------------------- 1.18s 2026-04-07 01:00:42.680945 | orchestrator | grafana : Configuring Prometheus as data source for Grafana ------------- 1.12s 2026-04-07 01:00:42.680949 | orchestrator | grafana : Ensuring config directories exist ----------------------------- 1.03s 2026-04-07 01:00:42.680953 | orchestrator | service-check-containers : grafana | Check containers ------------------- 0.97s 2026-04-07 01:00:42.680957 | orchestrator | grafana : Creating grafana database ------------------------------------- 0.85s 2026-04-07 01:00:42.680961 | orchestrator | grafana : Remove templated Grafana dashboards --------------------------- 0.80s 2026-04-07 01:00:42.680965 | orchestrator | grafana : Check if extra configuration file exists ---------------------- 0.75s 2026-04-07 01:00:42.680968 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.67s 2026-04-07 01:00:42.680972 | orchestrator | grafana : Check if the folder for custom grafana dashboards exists ------ 0.67s 2026-04-07 01:00:42.680976 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS key ----- 0.61s 2026-04-07 01:00:42.680980 | orchestrator | grafana : include_tasks ------------------------------------------------- 0.51s 2026-04-07 01:00:42.680984 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS certificate --- 0.44s 2026-04-07 01:00:42.680988 | orchestrator | grafana : include_tasks ------------------------------------------------- 0.42s 2026-04-07 01:00:42.680991 | orchestrator | service-check-containers : grafana | Notify handlers to restart containers --- 0.28s 2026-04-07 01:00:42.681002 | orchestrator | grafana : Copying over extra configuration file ------------------------- 0.28s 
2026-04-07 01:00:42.681006 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.25s 2026-04-07 01:00:42.681010 | orchestrator | 2026-04-07 01:00:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:00:42.682140 | orchestrator | 2026-04-07 01:00:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:00:42.682219 | orchestrator | 2026-04-07 01:00:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:00:45.728316 | orchestrator | 2026-04-07 01:00:45 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state STARTED 2026-04-07 01:00:45.731006 | orchestrator | 2026-04-07 01:00:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:00:45.732631 | orchestrator | 2026-04-07 01:00:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:00:45.732689 | orchestrator | 2026-04-07 01:00:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:00:48.771856 | orchestrator | 2026-04-07 01:00:48 | INFO  | Task a5c8be9d-5e96-4af9-9913-9801c82353ef is in state SUCCESS 2026-04-07 01:00:48.776955 | orchestrator | 2026-04-07 01:00:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:00:48.780197 | orchestrator | 2026-04-07 01:00:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:00:48.780318 | orchestrator | 2026-04-07 01:00:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:00:51.819333 | orchestrator | 2026-04-07 01:00:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:00:51.819419 | orchestrator | 2026-04-07 01:00:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:00:51.819429 | orchestrator | 2026-04-07 01:00:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:00:54.867113 | orchestrator | 2026-04-07 01:00:54 | 
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:03.714614 | orchestrator | 2026-04-07 01:04:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:06.750558 | orchestrator | 2026-04-07 01:04:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:06.752583 | orchestrator | 2026-04-07 01:04:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:06.752809 | orchestrator | 2026-04-07 01:04:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:09.794846 | orchestrator | 2026-04-07 01:04:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:09.796556 | orchestrator | 2026-04-07 01:04:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:09.796627 | orchestrator | 2026-04-07 01:04:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:12.837805 | orchestrator | 2026-04-07 01:04:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:12.840884 | orchestrator | 2026-04-07 01:04:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:12.840936 | orchestrator | 2026-04-07 01:04:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:15.884663 | orchestrator | 2026-04-07 01:04:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:15.885882 | orchestrator | 2026-04-07 01:04:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:15.886114 | orchestrator | 2026-04-07 01:04:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:18.929283 | orchestrator | 2026-04-07 01:04:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:18.931184 | orchestrator | 2026-04-07 01:04:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:04:18.931254 | orchestrator | 2026-04-07 01:04:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:21.977156 | orchestrator | 2026-04-07 01:04:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:21.978931 | orchestrator | 2026-04-07 01:04:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:21.978985 | orchestrator | 2026-04-07 01:04:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:25.020900 | orchestrator | 2026-04-07 01:04:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:25.021225 | orchestrator | 2026-04-07 01:04:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:25.021243 | orchestrator | 2026-04-07 01:04:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:28.065246 | orchestrator | 2026-04-07 01:04:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:28.067024 | orchestrator | 2026-04-07 01:04:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:28.067130 | orchestrator | 2026-04-07 01:04:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:31.106115 | orchestrator | 2026-04-07 01:04:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:31.108085 | orchestrator | 2026-04-07 01:04:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:31.108170 | orchestrator | 2026-04-07 01:04:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:34.152383 | orchestrator | 2026-04-07 01:04:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:34.154707 | orchestrator | 2026-04-07 01:04:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:34.154787 | orchestrator | 2026-04-07 01:04:34 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:04:37.201964 | orchestrator | 2026-04-07 01:04:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:37.202934 | orchestrator | 2026-04-07 01:04:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:37.202995 | orchestrator | 2026-04-07 01:04:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:40.244352 | orchestrator | 2026-04-07 01:04:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:40.248087 | orchestrator | 2026-04-07 01:04:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:40.248173 | orchestrator | 2026-04-07 01:04:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:43.286701 | orchestrator | 2026-04-07 01:04:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:43.289527 | orchestrator | 2026-04-07 01:04:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:43.289596 | orchestrator | 2026-04-07 01:04:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:46.332051 | orchestrator | 2026-04-07 01:04:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:46.334322 | orchestrator | 2026-04-07 01:04:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:46.334428 | orchestrator | 2026-04-07 01:04:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:49.378712 | orchestrator | 2026-04-07 01:04:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:49.379965 | orchestrator | 2026-04-07 01:04:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:49.380023 | orchestrator | 2026-04-07 01:04:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:52.425430 | orchestrator | 2026-04-07 
01:04:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:52.425969 | orchestrator | 2026-04-07 01:04:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:52.426160 | orchestrator | 2026-04-07 01:04:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:55.471984 | orchestrator | 2026-04-07 01:04:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:55.473921 | orchestrator | 2026-04-07 01:04:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:55.473991 | orchestrator | 2026-04-07 01:04:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:04:58.517480 | orchestrator | 2026-04-07 01:04:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:04:58.519762 | orchestrator | 2026-04-07 01:04:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:04:58.519795 | orchestrator | 2026-04-07 01:04:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:01.557617 | orchestrator | 2026-04-07 01:05:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:01.559400 | orchestrator | 2026-04-07 01:05:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:01.559457 | orchestrator | 2026-04-07 01:05:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:04.595849 | orchestrator | 2026-04-07 01:05:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:04.595971 | orchestrator | 2026-04-07 01:05:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:04.596016 | orchestrator | 2026-04-07 01:05:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:07.637594 | orchestrator | 2026-04-07 01:05:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:05:07.638777 | orchestrator | 2026-04-07 01:05:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:07.638844 | orchestrator | 2026-04-07 01:05:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:10.682879 | orchestrator | 2026-04-07 01:05:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:10.684524 | orchestrator | 2026-04-07 01:05:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:10.684591 | orchestrator | 2026-04-07 01:05:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:13.724175 | orchestrator | 2026-04-07 01:05:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:13.726149 | orchestrator | 2026-04-07 01:05:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:13.726211 | orchestrator | 2026-04-07 01:05:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:16.768787 | orchestrator | 2026-04-07 01:05:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:16.770266 | orchestrator | 2026-04-07 01:05:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:16.770325 | orchestrator | 2026-04-07 01:05:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:19.818993 | orchestrator | 2026-04-07 01:05:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:19.820595 | orchestrator | 2026-04-07 01:05:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:19.820749 | orchestrator | 2026-04-07 01:05:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:22.858197 | orchestrator | 2026-04-07 01:05:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:22.859027 | orchestrator | 2026-04-07 01:05:22 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:22.859152 | orchestrator | 2026-04-07 01:05:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:25.906765 | orchestrator | 2026-04-07 01:05:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:25.908337 | orchestrator | 2026-04-07 01:05:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:25.908423 | orchestrator | 2026-04-07 01:05:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:28.954953 | orchestrator | 2026-04-07 01:05:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:28.956425 | orchestrator | 2026-04-07 01:05:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:28.956481 | orchestrator | 2026-04-07 01:05:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:31.993665 | orchestrator | 2026-04-07 01:05:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:31.995360 | orchestrator | 2026-04-07 01:05:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:31.995450 | orchestrator | 2026-04-07 01:05:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:35.040646 | orchestrator | 2026-04-07 01:05:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:35.042247 | orchestrator | 2026-04-07 01:05:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:35.042306 | orchestrator | 2026-04-07 01:05:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:38.083454 | orchestrator | 2026-04-07 01:05:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:38.084796 | orchestrator | 2026-04-07 01:05:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:05:38.084843 | orchestrator | 2026-04-07 01:05:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:41.129909 | orchestrator | 2026-04-07 01:05:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:41.133048 | orchestrator | 2026-04-07 01:05:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:41.133132 | orchestrator | 2026-04-07 01:05:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:44.170287 | orchestrator | 2026-04-07 01:05:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:44.172657 | orchestrator | 2026-04-07 01:05:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:44.172723 | orchestrator | 2026-04-07 01:05:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:47.216867 | orchestrator | 2026-04-07 01:05:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:47.218682 | orchestrator | 2026-04-07 01:05:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:47.218748 | orchestrator | 2026-04-07 01:05:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:50.258522 | orchestrator | 2026-04-07 01:05:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:50.260642 | orchestrator | 2026-04-07 01:05:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:50.260704 | orchestrator | 2026-04-07 01:05:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:53.305680 | orchestrator | 2026-04-07 01:05:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:53.307867 | orchestrator | 2026-04-07 01:05:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:53.307926 | orchestrator | 2026-04-07 01:05:53 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:05:56.354934 | orchestrator | 2026-04-07 01:05:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:56.356687 | orchestrator | 2026-04-07 01:05:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:56.356759 | orchestrator | 2026-04-07 01:05:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:05:59.397938 | orchestrator | 2026-04-07 01:05:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:05:59.399451 | orchestrator | 2026-04-07 01:05:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:05:59.399472 | orchestrator | 2026-04-07 01:05:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:02.443863 | orchestrator | 2026-04-07 01:06:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:02.445600 | orchestrator | 2026-04-07 01:06:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:02.445630 | orchestrator | 2026-04-07 01:06:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:05.497520 | orchestrator | 2026-04-07 01:06:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:05.499026 | orchestrator | 2026-04-07 01:06:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:05.499297 | orchestrator | 2026-04-07 01:06:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:08.544513 | orchestrator | 2026-04-07 01:06:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:08.545944 | orchestrator | 2026-04-07 01:06:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:08.546118 | orchestrator | 2026-04-07 01:06:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:11.589111 | orchestrator | 2026-04-07 
01:06:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:11.590987 | orchestrator | 2026-04-07 01:06:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:11.591041 | orchestrator | 2026-04-07 01:06:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:14.632041 | orchestrator | 2026-04-07 01:06:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:14.633744 | orchestrator | 2026-04-07 01:06:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:14.633787 | orchestrator | 2026-04-07 01:06:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:17.673825 | orchestrator | 2026-04-07 01:06:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:17.675225 | orchestrator | 2026-04-07 01:06:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:17.675267 | orchestrator | 2026-04-07 01:06:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:20.717103 | orchestrator | 2026-04-07 01:06:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:20.717571 | orchestrator | 2026-04-07 01:06:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:20.717588 | orchestrator | 2026-04-07 01:06:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:23.765042 | orchestrator | 2026-04-07 01:06:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:23.765145 | orchestrator | 2026-04-07 01:06:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:23.765158 | orchestrator | 2026-04-07 01:06:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:26.803249 | orchestrator | 2026-04-07 01:06:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:06:26.805503 | orchestrator | 2026-04-07 01:06:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:26.805629 | orchestrator | 2026-04-07 01:06:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:29.845479 | orchestrator | 2026-04-07 01:06:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:29.848192 | orchestrator | 2026-04-07 01:06:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:29.848252 | orchestrator | 2026-04-07 01:06:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:32.895604 | orchestrator | 2026-04-07 01:06:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:32.898166 | orchestrator | 2026-04-07 01:06:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:32.898217 | orchestrator | 2026-04-07 01:06:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:35.938984 | orchestrator | 2026-04-07 01:06:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:35.940912 | orchestrator | 2026-04-07 01:06:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:35.940960 | orchestrator | 2026-04-07 01:06:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:38.983451 | orchestrator | 2026-04-07 01:06:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:38.985135 | orchestrator | 2026-04-07 01:06:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:38.985210 | orchestrator | 2026-04-07 01:06:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:42.026859 | orchestrator | 2026-04-07 01:06:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:42.028671 | orchestrator | 2026-04-07 01:06:42 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:42.028763 | orchestrator | 2026-04-07 01:06:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:45.066137 | orchestrator | 2026-04-07 01:06:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:45.067942 | orchestrator | 2026-04-07 01:06:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:45.068117 | orchestrator | 2026-04-07 01:06:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:48.118211 | orchestrator | 2026-04-07 01:06:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:48.119776 | orchestrator | 2026-04-07 01:06:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:48.119840 | orchestrator | 2026-04-07 01:06:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:51.161641 | orchestrator | 2026-04-07 01:06:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:51.163941 | orchestrator | 2026-04-07 01:06:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:51.164057 | orchestrator | 2026-04-07 01:06:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:54.207325 | orchestrator | 2026-04-07 01:06:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:54.209981 | orchestrator | 2026-04-07 01:06:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:06:54.210084 | orchestrator | 2026-04-07 01:06:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:06:57.255665 | orchestrator | 2026-04-07 01:06:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:06:57.257513 | orchestrator | 2026-04-07 01:06:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:06:57.257565 | orchestrator | 2026-04-07 01:06:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:00.299354 | orchestrator | 2026-04-07 01:07:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:00.301372 | orchestrator | 2026-04-07 01:07:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:00.301493 | orchestrator | 2026-04-07 01:07:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:03.342244 | orchestrator | 2026-04-07 01:07:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:03.343989 | orchestrator | 2026-04-07 01:07:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:03.344051 | orchestrator | 2026-04-07 01:07:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:06.390158 | orchestrator | 2026-04-07 01:07:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:06.393837 | orchestrator | 2026-04-07 01:07:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:06.393883 | orchestrator | 2026-04-07 01:07:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:09.435454 | orchestrator | 2026-04-07 01:07:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:09.437707 | orchestrator | 2026-04-07 01:07:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:09.437759 | orchestrator | 2026-04-07 01:07:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:12.488329 | orchestrator | 2026-04-07 01:07:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:12.490125 | orchestrator | 2026-04-07 01:07:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:12.490206 | orchestrator | 2026-04-07 01:07:12 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:07:15.539028 | orchestrator | 2026-04-07 01:07:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:15.541081 | orchestrator | 2026-04-07 01:07:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:15.541157 | orchestrator | 2026-04-07 01:07:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:18.583703 | orchestrator | 2026-04-07 01:07:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:18.586059 | orchestrator | 2026-04-07 01:07:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:18.586120 | orchestrator | 2026-04-07 01:07:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:21.630162 | orchestrator | 2026-04-07 01:07:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:21.631333 | orchestrator | 2026-04-07 01:07:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:21.631385 | orchestrator | 2026-04-07 01:07:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:24.675268 | orchestrator | 2026-04-07 01:07:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:24.676601 | orchestrator | 2026-04-07 01:07:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:24.676713 | orchestrator | 2026-04-07 01:07:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:27.723979 | orchestrator | 2026-04-07 01:07:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:27.725888 | orchestrator | 2026-04-07 01:07:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:27.725940 | orchestrator | 2026-04-07 01:07:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:30.771739 | orchestrator | 2026-04-07 
01:07:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:30.773609 | orchestrator | 2026-04-07 01:07:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:30.773674 | orchestrator | 2026-04-07 01:07:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:33.813068 | orchestrator | 2026-04-07 01:07:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:33.814309 | orchestrator | 2026-04-07 01:07:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:33.814375 | orchestrator | 2026-04-07 01:07:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:36.858361 | orchestrator | 2026-04-07 01:07:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:36.860339 | orchestrator | 2026-04-07 01:07:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:36.860423 | orchestrator | 2026-04-07 01:07:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:39.904732 | orchestrator | 2026-04-07 01:07:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:39.906284 | orchestrator | 2026-04-07 01:07:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:39.906437 | orchestrator | 2026-04-07 01:07:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:42.950393 | orchestrator | 2026-04-07 01:07:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:07:42.952740 | orchestrator | 2026-04-07 01:07:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:42.952815 | orchestrator | 2026-04-07 01:07:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:07:45.989938 | orchestrator | 2026-04-07 01:07:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:07:45.991858 | orchestrator | 2026-04-07 01:07:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:07:45.991923 | orchestrator | 2026-04-07 01:07:45 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 01:07:49 to 01:12:59: tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remained in state STARTED, each check followed by "Wait 1 second(s) until the next check" ...]
2026-04-07 01:13:02.846572 | orchestrator | 2026-04-07 01:13:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state
STARTED 2026-04-07 01:13:02.848408 | orchestrator | 2026-04-07 01:13:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:02.848470 | orchestrator | 2026-04-07 01:13:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:05.900639 | orchestrator | 2026-04-07 01:13:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:05.903440 | orchestrator | 2026-04-07 01:13:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:05.903499 | orchestrator | 2026-04-07 01:13:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:08.952227 | orchestrator | 2026-04-07 01:13:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:08.953348 | orchestrator | 2026-04-07 01:13:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:08.953391 | orchestrator | 2026-04-07 01:13:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:12.000789 | orchestrator | 2026-04-07 01:13:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:12.002934 | orchestrator | 2026-04-07 01:13:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:12.003036 | orchestrator | 2026-04-07 01:13:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:15.046214 | orchestrator | 2026-04-07 01:13:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:15.049092 | orchestrator | 2026-04-07 01:13:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:15.049149 | orchestrator | 2026-04-07 01:13:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:18.096136 | orchestrator | 2026-04-07 01:13:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:18.098833 | orchestrator | 2026-04-07 01:13:18 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:18.098918 | orchestrator | 2026-04-07 01:13:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:21.145424 | orchestrator | 2026-04-07 01:13:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:21.146610 | orchestrator | 2026-04-07 01:13:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:21.146899 | orchestrator | 2026-04-07 01:13:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:24.192691 | orchestrator | 2026-04-07 01:13:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:24.194071 | orchestrator | 2026-04-07 01:13:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:24.194128 | orchestrator | 2026-04-07 01:13:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:27.230917 | orchestrator | 2026-04-07 01:13:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:27.232619 | orchestrator | 2026-04-07 01:13:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:27.232727 | orchestrator | 2026-04-07 01:13:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:30.282397 | orchestrator | 2026-04-07 01:13:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:30.283131 | orchestrator | 2026-04-07 01:13:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:30.283245 | orchestrator | 2026-04-07 01:13:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:33.333845 | orchestrator | 2026-04-07 01:13:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:33.335078 | orchestrator | 2026-04-07 01:13:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:13:33.335135 | orchestrator | 2026-04-07 01:13:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:36.381099 | orchestrator | 2026-04-07 01:13:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:36.383240 | orchestrator | 2026-04-07 01:13:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:36.383368 | orchestrator | 2026-04-07 01:13:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:39.433025 | orchestrator | 2026-04-07 01:13:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:39.435819 | orchestrator | 2026-04-07 01:13:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:39.435899 | orchestrator | 2026-04-07 01:13:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:42.481067 | orchestrator | 2026-04-07 01:13:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:42.482599 | orchestrator | 2026-04-07 01:13:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:42.482669 | orchestrator | 2026-04-07 01:13:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:45.528541 | orchestrator | 2026-04-07 01:13:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:45.530151 | orchestrator | 2026-04-07 01:13:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:45.530209 | orchestrator | 2026-04-07 01:13:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:48.570750 | orchestrator | 2026-04-07 01:13:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:48.572056 | orchestrator | 2026-04-07 01:13:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:48.572135 | orchestrator | 2026-04-07 01:13:48 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:13:51.614462 | orchestrator | 2026-04-07 01:13:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:51.615929 | orchestrator | 2026-04-07 01:13:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:51.616013 | orchestrator | 2026-04-07 01:13:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:54.660960 | orchestrator | 2026-04-07 01:13:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:54.662455 | orchestrator | 2026-04-07 01:13:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:54.662521 | orchestrator | 2026-04-07 01:13:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:13:57.710432 | orchestrator | 2026-04-07 01:13:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:13:57.712117 | orchestrator | 2026-04-07 01:13:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:13:57.712289 | orchestrator | 2026-04-07 01:13:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:00.762564 | orchestrator | 2026-04-07 01:14:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:00.763834 | orchestrator | 2026-04-07 01:14:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:00.763856 | orchestrator | 2026-04-07 01:14:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:03.813494 | orchestrator | 2026-04-07 01:14:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:03.816534 | orchestrator | 2026-04-07 01:14:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:03.816605 | orchestrator | 2026-04-07 01:14:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:06.863449 | orchestrator | 2026-04-07 
01:14:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:06.865327 | orchestrator | 2026-04-07 01:14:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:06.865368 | orchestrator | 2026-04-07 01:14:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:09.919703 | orchestrator | 2026-04-07 01:14:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:09.922667 | orchestrator | 2026-04-07 01:14:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:09.922736 | orchestrator | 2026-04-07 01:14:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:12.968227 | orchestrator | 2026-04-07 01:14:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:12.970146 | orchestrator | 2026-04-07 01:14:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:12.970212 | orchestrator | 2026-04-07 01:14:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:16.018772 | orchestrator | 2026-04-07 01:14:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:16.020078 | orchestrator | 2026-04-07 01:14:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:16.020144 | orchestrator | 2026-04-07 01:14:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:19.063377 | orchestrator | 2026-04-07 01:14:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:19.065403 | orchestrator | 2026-04-07 01:14:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:19.065475 | orchestrator | 2026-04-07 01:14:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:22.110738 | orchestrator | 2026-04-07 01:14:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:14:22.112842 | orchestrator | 2026-04-07 01:14:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:22.112900 | orchestrator | 2026-04-07 01:14:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:25.167442 | orchestrator | 2026-04-07 01:14:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:25.168417 | orchestrator | 2026-04-07 01:14:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:25.168656 | orchestrator | 2026-04-07 01:14:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:28.208924 | orchestrator | 2026-04-07 01:14:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:28.211223 | orchestrator | 2026-04-07 01:14:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:28.211332 | orchestrator | 2026-04-07 01:14:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:31.256979 | orchestrator | 2026-04-07 01:14:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:31.257657 | orchestrator | 2026-04-07 01:14:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:31.257844 | orchestrator | 2026-04-07 01:14:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:34.304629 | orchestrator | 2026-04-07 01:14:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:34.307488 | orchestrator | 2026-04-07 01:14:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:34.307568 | orchestrator | 2026-04-07 01:14:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:37.350627 | orchestrator | 2026-04-07 01:14:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:37.352034 | orchestrator | 2026-04-07 01:14:37 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:37.352145 | orchestrator | 2026-04-07 01:14:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:40.405389 | orchestrator | 2026-04-07 01:14:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:40.406848 | orchestrator | 2026-04-07 01:14:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:40.407035 | orchestrator | 2026-04-07 01:14:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:43.451234 | orchestrator | 2026-04-07 01:14:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:43.453026 | orchestrator | 2026-04-07 01:14:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:43.453073 | orchestrator | 2026-04-07 01:14:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:46.495740 | orchestrator | 2026-04-07 01:14:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:46.497454 | orchestrator | 2026-04-07 01:14:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:46.497538 | orchestrator | 2026-04-07 01:14:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:49.537736 | orchestrator | 2026-04-07 01:14:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:49.538940 | orchestrator | 2026-04-07 01:14:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:49.539061 | orchestrator | 2026-04-07 01:14:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:52.581116 | orchestrator | 2026-04-07 01:14:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:52.583523 | orchestrator | 2026-04-07 01:14:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:14:52.583581 | orchestrator | 2026-04-07 01:14:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:55.631638 | orchestrator | 2026-04-07 01:14:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:55.632361 | orchestrator | 2026-04-07 01:14:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:55.632413 | orchestrator | 2026-04-07 01:14:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:14:58.683229 | orchestrator | 2026-04-07 01:14:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:14:58.683311 | orchestrator | 2026-04-07 01:14:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:14:58.683317 | orchestrator | 2026-04-07 01:14:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:01.740026 | orchestrator | 2026-04-07 01:15:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:01.740655 | orchestrator | 2026-04-07 01:15:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:01.740741 | orchestrator | 2026-04-07 01:15:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:04.797730 | orchestrator | 2026-04-07 01:15:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:04.799557 | orchestrator | 2026-04-07 01:15:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:04.799643 | orchestrator | 2026-04-07 01:15:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:07.842238 | orchestrator | 2026-04-07 01:15:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:07.843837 | orchestrator | 2026-04-07 01:15:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:07.843935 | orchestrator | 2026-04-07 01:15:07 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:15:10.893704 | orchestrator | 2026-04-07 01:15:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:10.895593 | orchestrator | 2026-04-07 01:15:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:10.896038 | orchestrator | 2026-04-07 01:15:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:13.934487 | orchestrator | 2026-04-07 01:15:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:13.937507 | orchestrator | 2026-04-07 01:15:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:13.937551 | orchestrator | 2026-04-07 01:15:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:16.990161 | orchestrator | 2026-04-07 01:15:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:16.991868 | orchestrator | 2026-04-07 01:15:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:16.991906 | orchestrator | 2026-04-07 01:15:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:20.040498 | orchestrator | 2026-04-07 01:15:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:20.042118 | orchestrator | 2026-04-07 01:15:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:20.042173 | orchestrator | 2026-04-07 01:15:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:23.090525 | orchestrator | 2026-04-07 01:15:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:23.091081 | orchestrator | 2026-04-07 01:15:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:23.091178 | orchestrator | 2026-04-07 01:15:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:26.140183 | orchestrator | 2026-04-07 
01:15:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:26.142800 | orchestrator | 2026-04-07 01:15:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:26.144366 | orchestrator | 2026-04-07 01:15:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:29.185515 | orchestrator | 2026-04-07 01:15:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:29.187496 | orchestrator | 2026-04-07 01:15:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:29.187625 | orchestrator | 2026-04-07 01:15:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:32.238198 | orchestrator | 2026-04-07 01:15:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:32.239872 | orchestrator | 2026-04-07 01:15:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:32.239983 | orchestrator | 2026-04-07 01:15:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:35.303041 | orchestrator | 2026-04-07 01:15:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:35.305699 | orchestrator | 2026-04-07 01:15:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:35.305800 | orchestrator | 2026-04-07 01:15:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:38.355013 | orchestrator | 2026-04-07 01:15:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:38.355826 | orchestrator | 2026-04-07 01:15:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:38.355881 | orchestrator | 2026-04-07 01:15:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:41.405728 | orchestrator | 2026-04-07 01:15:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:15:41.407722 | orchestrator | 2026-04-07 01:15:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:41.407776 | orchestrator | 2026-04-07 01:15:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:44.453671 | orchestrator | 2026-04-07 01:15:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:44.455221 | orchestrator | 2026-04-07 01:15:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:44.455390 | orchestrator | 2026-04-07 01:15:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:47.502606 | orchestrator | 2026-04-07 01:15:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:47.504219 | orchestrator | 2026-04-07 01:15:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:47.504299 | orchestrator | 2026-04-07 01:15:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:50.549400 | orchestrator | 2026-04-07 01:15:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:50.550794 | orchestrator | 2026-04-07 01:15:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:50.550851 | orchestrator | 2026-04-07 01:15:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:53.592163 | orchestrator | 2026-04-07 01:15:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:53.593397 | orchestrator | 2026-04-07 01:15:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:53.593460 | orchestrator | 2026-04-07 01:15:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:56.632216 | orchestrator | 2026-04-07 01:15:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:56.633797 | orchestrator | 2026-04-07 01:15:56 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:56.633856 | orchestrator | 2026-04-07 01:15:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:15:59.674651 | orchestrator | 2026-04-07 01:15:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:15:59.674993 | orchestrator | 2026-04-07 01:15:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:15:59.675265 | orchestrator | 2026-04-07 01:15:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:02.727022 | orchestrator | 2026-04-07 01:16:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:02.728450 | orchestrator | 2026-04-07 01:16:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:02.728571 | orchestrator | 2026-04-07 01:16:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:05.773344 | orchestrator | 2026-04-07 01:16:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:05.775464 | orchestrator | 2026-04-07 01:16:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:05.775596 | orchestrator | 2026-04-07 01:16:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:08.822140 | orchestrator | 2026-04-07 01:16:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:08.823335 | orchestrator | 2026-04-07 01:16:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:08.823397 | orchestrator | 2026-04-07 01:16:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:11.868010 | orchestrator | 2026-04-07 01:16:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:11.869701 | orchestrator | 2026-04-07 01:16:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:16:11.869802 | orchestrator | 2026-04-07 01:16:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:14.910786 | orchestrator | 2026-04-07 01:16:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:14.912350 | orchestrator | 2026-04-07 01:16:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:14.912385 | orchestrator | 2026-04-07 01:16:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:17.957321 | orchestrator | 2026-04-07 01:16:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:17.960597 | orchestrator | 2026-04-07 01:16:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:17.960699 | orchestrator | 2026-04-07 01:16:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:21.014431 | orchestrator | 2026-04-07 01:16:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:21.017689 | orchestrator | 2026-04-07 01:16:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:21.017778 | orchestrator | 2026-04-07 01:16:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:24.064889 | orchestrator | 2026-04-07 01:16:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:24.066686 | orchestrator | 2026-04-07 01:16:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:24.066733 | orchestrator | 2026-04-07 01:16:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:16:27.124302 | orchestrator | 2026-04-07 01:16:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:27.125939 | orchestrator | 2026-04-07 01:16:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:27.125985 | orchestrator | 2026-04-07 01:16:27 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:16:30.168851 | orchestrator | 2026-04-07 01:16:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:16:30.170620 | orchestrator | 2026-04-07 01:16:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:16:30.170655 | orchestrator | 2026-04-07 01:16:30 | INFO  | Wait 1 second(s) until the next check
[... identical status-poll entries repeated every ~3 seconds from 01:16:33 through 01:21:41; both tasks remain in state STARTED throughout ...]
2026-04-07 01:21:44.200009 | orchestrator | 2026-04-07 01:21:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:21:44.201887 | orchestrator | 2026-04-07 01:21:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:21:44.201930 | orchestrator | 2026-04-07 01:21:44 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:21:47.245271 | orchestrator | 2026-04-07 01:21:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:21:47.246199 | orchestrator | 2026-04-07 01:21:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:21:47.246436 | orchestrator | 2026-04-07 01:21:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:21:50.300231 | orchestrator | 2026-04-07 01:21:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:21:50.301284 | orchestrator | 2026-04-07 01:21:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:21:50.301401 | orchestrator | 2026-04-07 01:21:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:21:53.348647 | orchestrator | 2026-04-07 01:21:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:21:53.349897 | orchestrator | 2026-04-07 01:21:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:21:53.350081 | orchestrator | 2026-04-07 01:21:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:21:56.388123 | orchestrator | 2026-04-07 01:21:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:21:56.388454 | orchestrator | 2026-04-07 01:21:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:21:56.388490 | orchestrator | 2026-04-07 01:21:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:21:59.433504 | orchestrator | 2026-04-07 01:21:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:21:59.435431 | orchestrator | 2026-04-07 01:21:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:21:59.435554 | orchestrator | 2026-04-07 01:21:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:02.479376 | orchestrator | 2026-04-07 
01:22:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:02.481033 | orchestrator | 2026-04-07 01:22:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:02.481050 | orchestrator | 2026-04-07 01:22:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:05.532248 | orchestrator | 2026-04-07 01:22:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:05.534054 | orchestrator | 2026-04-07 01:22:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:05.534086 | orchestrator | 2026-04-07 01:22:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:08.579438 | orchestrator | 2026-04-07 01:22:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:08.580236 | orchestrator | 2026-04-07 01:22:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:08.580249 | orchestrator | 2026-04-07 01:22:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:11.621439 | orchestrator | 2026-04-07 01:22:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:11.623471 | orchestrator | 2026-04-07 01:22:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:11.623610 | orchestrator | 2026-04-07 01:22:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:14.672785 | orchestrator | 2026-04-07 01:22:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:14.674557 | orchestrator | 2026-04-07 01:22:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:14.674647 | orchestrator | 2026-04-07 01:22:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:17.726099 | orchestrator | 2026-04-07 01:22:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:22:17.727393 | orchestrator | 2026-04-07 01:22:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:17.727457 | orchestrator | 2026-04-07 01:22:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:20.782297 | orchestrator | 2026-04-07 01:22:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:20.785704 | orchestrator | 2026-04-07 01:22:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:20.785913 | orchestrator | 2026-04-07 01:22:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:23.827774 | orchestrator | 2026-04-07 01:22:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:23.830264 | orchestrator | 2026-04-07 01:22:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:23.830337 | orchestrator | 2026-04-07 01:22:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:26.872707 | orchestrator | 2026-04-07 01:22:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:26.873325 | orchestrator | 2026-04-07 01:22:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:26.873503 | orchestrator | 2026-04-07 01:22:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:29.917656 | orchestrator | 2026-04-07 01:22:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:29.920153 | orchestrator | 2026-04-07 01:22:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:29.920198 | orchestrator | 2026-04-07 01:22:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:32.966836 | orchestrator | 2026-04-07 01:22:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:32.968388 | orchestrator | 2026-04-07 01:22:32 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:32.968434 | orchestrator | 2026-04-07 01:22:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:36.017448 | orchestrator | 2026-04-07 01:22:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:36.018903 | orchestrator | 2026-04-07 01:22:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:36.018953 | orchestrator | 2026-04-07 01:22:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:39.070905 | orchestrator | 2026-04-07 01:22:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:39.072692 | orchestrator | 2026-04-07 01:22:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:39.072713 | orchestrator | 2026-04-07 01:22:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:42.123184 | orchestrator | 2026-04-07 01:22:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:42.125755 | orchestrator | 2026-04-07 01:22:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:42.125804 | orchestrator | 2026-04-07 01:22:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:45.175946 | orchestrator | 2026-04-07 01:22:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:45.177300 | orchestrator | 2026-04-07 01:22:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:45.177325 | orchestrator | 2026-04-07 01:22:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:48.222312 | orchestrator | 2026-04-07 01:22:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:48.261036 | orchestrator | 2026-04-07 01:22:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:22:48.261123 | orchestrator | 2026-04-07 01:22:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:51.277728 | orchestrator | 2026-04-07 01:22:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:51.279848 | orchestrator | 2026-04-07 01:22:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:51.279891 | orchestrator | 2026-04-07 01:22:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:54.326257 | orchestrator | 2026-04-07 01:22:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:54.328576 | orchestrator | 2026-04-07 01:22:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:54.328740 | orchestrator | 2026-04-07 01:22:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:22:57.373994 | orchestrator | 2026-04-07 01:22:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:22:57.375158 | orchestrator | 2026-04-07 01:22:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:22:57.375192 | orchestrator | 2026-04-07 01:22:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:00.423421 | orchestrator | 2026-04-07 01:23:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:00.425058 | orchestrator | 2026-04-07 01:23:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:00.425151 | orchestrator | 2026-04-07 01:23:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:03.473889 | orchestrator | 2026-04-07 01:23:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:03.474834 | orchestrator | 2026-04-07 01:23:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:03.474854 | orchestrator | 2026-04-07 01:23:03 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:23:06.530517 | orchestrator | 2026-04-07 01:23:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:06.533153 | orchestrator | 2026-04-07 01:23:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:06.533271 | orchestrator | 2026-04-07 01:23:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:09.580524 | orchestrator | 2026-04-07 01:23:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:09.582418 | orchestrator | 2026-04-07 01:23:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:09.582477 | orchestrator | 2026-04-07 01:23:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:12.626115 | orchestrator | 2026-04-07 01:23:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:12.627369 | orchestrator | 2026-04-07 01:23:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:12.627405 | orchestrator | 2026-04-07 01:23:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:15.678260 | orchestrator | 2026-04-07 01:23:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:15.680027 | orchestrator | 2026-04-07 01:23:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:15.680107 | orchestrator | 2026-04-07 01:23:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:18.731957 | orchestrator | 2026-04-07 01:23:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:18.806674 | orchestrator | 2026-04-07 01:23:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:18.806756 | orchestrator | 2026-04-07 01:23:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:21.778972 | orchestrator | 2026-04-07 
01:23:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:21.780345 | orchestrator | 2026-04-07 01:23:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:21.780445 | orchestrator | 2026-04-07 01:23:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:24.831296 | orchestrator | 2026-04-07 01:23:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:24.833901 | orchestrator | 2026-04-07 01:23:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:24.833994 | orchestrator | 2026-04-07 01:23:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:27.877556 | orchestrator | 2026-04-07 01:23:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:27.882808 | orchestrator | 2026-04-07 01:23:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:27.882877 | orchestrator | 2026-04-07 01:23:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:30.926230 | orchestrator | 2026-04-07 01:23:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:30.928447 | orchestrator | 2026-04-07 01:23:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:30.928533 | orchestrator | 2026-04-07 01:23:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:33.975450 | orchestrator | 2026-04-07 01:23:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:33.979250 | orchestrator | 2026-04-07 01:23:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:33.979340 | orchestrator | 2026-04-07 01:23:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:37.027652 | orchestrator | 2026-04-07 01:23:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:23:37.030915 | orchestrator | 2026-04-07 01:23:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:37.030989 | orchestrator | 2026-04-07 01:23:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:40.081596 | orchestrator | 2026-04-07 01:23:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:40.082982 | orchestrator | 2026-04-07 01:23:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:40.083180 | orchestrator | 2026-04-07 01:23:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:43.135517 | orchestrator | 2026-04-07 01:23:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:43.136984 | orchestrator | 2026-04-07 01:23:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:43.137071 | orchestrator | 2026-04-07 01:23:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:46.189836 | orchestrator | 2026-04-07 01:23:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:46.191019 | orchestrator | 2026-04-07 01:23:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:46.191078 | orchestrator | 2026-04-07 01:23:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:49.235660 | orchestrator | 2026-04-07 01:23:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:49.237742 | orchestrator | 2026-04-07 01:23:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:49.237806 | orchestrator | 2026-04-07 01:23:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:52.289289 | orchestrator | 2026-04-07 01:23:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:52.291321 | orchestrator | 2026-04-07 01:23:52 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:52.292065 | orchestrator | 2026-04-07 01:23:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:55.350890 | orchestrator | 2026-04-07 01:23:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:55.352553 | orchestrator | 2026-04-07 01:23:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:55.352611 | orchestrator | 2026-04-07 01:23:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:23:58.394803 | orchestrator | 2026-04-07 01:23:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:23:58.396401 | orchestrator | 2026-04-07 01:23:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:23:58.396465 | orchestrator | 2026-04-07 01:23:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:01.443101 | orchestrator | 2026-04-07 01:24:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:01.446258 | orchestrator | 2026-04-07 01:24:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:01.446334 | orchestrator | 2026-04-07 01:24:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:04.488980 | orchestrator | 2026-04-07 01:24:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:04.490528 | orchestrator | 2026-04-07 01:24:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:04.491029 | orchestrator | 2026-04-07 01:24:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:07.536807 | orchestrator | 2026-04-07 01:24:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:07.537599 | orchestrator | 2026-04-07 01:24:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:24:07.537652 | orchestrator | 2026-04-07 01:24:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:10.588638 | orchestrator | 2026-04-07 01:24:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:10.590221 | orchestrator | 2026-04-07 01:24:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:10.590289 | orchestrator | 2026-04-07 01:24:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:13.642734 | orchestrator | 2026-04-07 01:24:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:13.644556 | orchestrator | 2026-04-07 01:24:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:13.644619 | orchestrator | 2026-04-07 01:24:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:16.683017 | orchestrator | 2026-04-07 01:24:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:16.683555 | orchestrator | 2026-04-07 01:24:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:16.683681 | orchestrator | 2026-04-07 01:24:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:19.732267 | orchestrator | 2026-04-07 01:24:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:19.734001 | orchestrator | 2026-04-07 01:24:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:19.734158 | orchestrator | 2026-04-07 01:24:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:22.780200 | orchestrator | 2026-04-07 01:24:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:22.781232 | orchestrator | 2026-04-07 01:24:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:22.781247 | orchestrator | 2026-04-07 01:24:22 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:24:25.820918 | orchestrator | 2026-04-07 01:24:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:25.823055 | orchestrator | 2026-04-07 01:24:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:25.823071 | orchestrator | 2026-04-07 01:24:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:28.865760 | orchestrator | 2026-04-07 01:24:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:28.865983 | orchestrator | 2026-04-07 01:24:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:28.866007 | orchestrator | 2026-04-07 01:24:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:31.911546 | orchestrator | 2026-04-07 01:24:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:31.912646 | orchestrator | 2026-04-07 01:24:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:31.912921 | orchestrator | 2026-04-07 01:24:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:34.960273 | orchestrator | 2026-04-07 01:24:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:34.961453 | orchestrator | 2026-04-07 01:24:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:34.961603 | orchestrator | 2026-04-07 01:24:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:38.005700 | orchestrator | 2026-04-07 01:24:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:38.008738 | orchestrator | 2026-04-07 01:24:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:38.008827 | orchestrator | 2026-04-07 01:24:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:41.055171 | orchestrator | 2026-04-07 
01:24:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:41.056950 | orchestrator | 2026-04-07 01:24:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:41.057001 | orchestrator | 2026-04-07 01:24:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:44.098154 | orchestrator | 2026-04-07 01:24:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:44.099443 | orchestrator | 2026-04-07 01:24:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:44.099510 | orchestrator | 2026-04-07 01:24:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:47.147232 | orchestrator | 2026-04-07 01:24:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:47.148918 | orchestrator | 2026-04-07 01:24:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:47.148986 | orchestrator | 2026-04-07 01:24:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:50.205578 | orchestrator | 2026-04-07 01:24:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:50.207273 | orchestrator | 2026-04-07 01:24:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:50.207524 | orchestrator | 2026-04-07 01:24:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:53.257992 | orchestrator | 2026-04-07 01:24:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:53.259769 | orchestrator | 2026-04-07 01:24:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:53.259825 | orchestrator | 2026-04-07 01:24:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:56.302814 | orchestrator | 2026-04-07 01:24:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:24:56.305780 | orchestrator | 2026-04-07 01:24:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:56.305861 | orchestrator | 2026-04-07 01:24:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:24:59.362171 | orchestrator | 2026-04-07 01:24:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:24:59.364312 | orchestrator | 2026-04-07 01:24:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:24:59.364448 | orchestrator | 2026-04-07 01:24:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:02.408253 | orchestrator | 2026-04-07 01:25:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:02.409907 | orchestrator | 2026-04-07 01:25:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:25:02.410084 | orchestrator | 2026-04-07 01:25:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:05.459936 | orchestrator | 2026-04-07 01:25:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:05.461098 | orchestrator | 2026-04-07 01:25:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:25:05.461153 | orchestrator | 2026-04-07 01:25:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:08.511812 | orchestrator | 2026-04-07 01:25:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:08.514636 | orchestrator | 2026-04-07 01:25:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:25:08.514684 | orchestrator | 2026-04-07 01:25:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:11.567773 | orchestrator | 2026-04-07 01:25:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:11.570561 | orchestrator | 2026-04-07 01:25:11 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:25:11.570667 | orchestrator | 2026-04-07 01:25:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:14.617181 | orchestrator | 2026-04-07 01:25:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:14.619494 | orchestrator | 2026-04-07 01:25:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:25:14.619601 | orchestrator | 2026-04-07 01:25:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:17.664935 | orchestrator | 2026-04-07 01:25:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:17.666132 | orchestrator | 2026-04-07 01:25:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:25:17.666173 | orchestrator | 2026-04-07 01:25:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:20.719121 | orchestrator | 2026-04-07 01:25:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:20.721609 | orchestrator | 2026-04-07 01:25:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:25:20.721730 | orchestrator | 2026-04-07 01:25:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:23.770202 | orchestrator | 2026-04-07 01:25:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:23.772505 | orchestrator | 2026-04-07 01:25:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:25:23.772658 | orchestrator | 2026-04-07 01:25:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:25:26.816993 | orchestrator | 2026-04-07 01:25:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:25:26.819791 | orchestrator | 2026-04-07 01:25:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:25:26.819862 | orchestrator | 2026-04-07 01:25:26 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:25:29.867455 | orchestrator | 2026-04-07 01:25:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:25:29.868911 | orchestrator | 2026-04-07 01:25:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:25:29.868942 | orchestrator | 2026-04-07 01:25:29 | INFO  | Wait 1 second(s) until the next check
[... identical polling cycle repeated approximately every 3 seconds from 01:25:32 through 01:30:56; both tasks remain in state STARTED throughout ...]
2026-04-07 01:30:59.233596 | orchestrator | 2026-04-07 01:30:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:30:59.238925 | orchestrator | 2026-04-07 01:30:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:30:59.238974 | orchestrator | 2026-04-07 01:30:59 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:31:02.293663 | orchestrator | 2026-04-07 01:31:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:02.295020 | orchestrator | 2026-04-07 01:31:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:02.295065 | orchestrator | 2026-04-07 01:31:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:05.350903 | orchestrator | 2026-04-07 01:31:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:05.352255 | orchestrator | 2026-04-07 01:31:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:05.352399 | orchestrator | 2026-04-07 01:31:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:08.403543 | orchestrator | 2026-04-07 01:31:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:08.404812 | orchestrator | 2026-04-07 01:31:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:08.404950 | orchestrator | 2026-04-07 01:31:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:11.449557 | orchestrator | 2026-04-07 01:31:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:11.450053 | orchestrator | 2026-04-07 01:31:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:11.450190 | orchestrator | 2026-04-07 01:31:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:14.497917 | orchestrator | 2026-04-07 01:31:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:14.499063 | orchestrator | 2026-04-07 01:31:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:14.499138 | orchestrator | 2026-04-07 01:31:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:17.552370 | orchestrator | 2026-04-07 
01:31:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:17.553834 | orchestrator | 2026-04-07 01:31:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:17.553883 | orchestrator | 2026-04-07 01:31:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:20.607295 | orchestrator | 2026-04-07 01:31:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:20.609104 | orchestrator | 2026-04-07 01:31:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:20.609171 | orchestrator | 2026-04-07 01:31:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:23.655706 | orchestrator | 2026-04-07 01:31:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:23.657234 | orchestrator | 2026-04-07 01:31:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:23.657342 | orchestrator | 2026-04-07 01:31:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:26.708392 | orchestrator | 2026-04-07 01:31:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:26.708953 | orchestrator | 2026-04-07 01:31:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:26.709034 | orchestrator | 2026-04-07 01:31:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:29.754898 | orchestrator | 2026-04-07 01:31:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:29.755245 | orchestrator | 2026-04-07 01:31:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:29.755275 | orchestrator | 2026-04-07 01:31:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:32.801168 | orchestrator | 2026-04-07 01:31:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:31:32.802011 | orchestrator | 2026-04-07 01:31:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:32.802252 | orchestrator | 2026-04-07 01:31:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:35.844756 | orchestrator | 2026-04-07 01:31:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:35.845375 | orchestrator | 2026-04-07 01:31:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:35.845554 | orchestrator | 2026-04-07 01:31:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:38.891310 | orchestrator | 2026-04-07 01:31:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:38.892371 | orchestrator | 2026-04-07 01:31:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:38.892414 | orchestrator | 2026-04-07 01:31:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:41.936059 | orchestrator | 2026-04-07 01:31:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:41.938602 | orchestrator | 2026-04-07 01:31:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:41.938758 | orchestrator | 2026-04-07 01:31:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:44.982682 | orchestrator | 2026-04-07 01:31:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:44.984036 | orchestrator | 2026-04-07 01:31:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:44.984164 | orchestrator | 2026-04-07 01:31:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:48.040340 | orchestrator | 2026-04-07 01:31:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:48.042068 | orchestrator | 2026-04-07 01:31:48 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:48.042126 | orchestrator | 2026-04-07 01:31:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:51.097124 | orchestrator | 2026-04-07 01:31:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:51.097689 | orchestrator | 2026-04-07 01:31:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:51.097704 | orchestrator | 2026-04-07 01:31:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:54.143623 | orchestrator | 2026-04-07 01:31:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:54.145671 | orchestrator | 2026-04-07 01:31:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:54.145685 | orchestrator | 2026-04-07 01:31:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:31:57.184925 | orchestrator | 2026-04-07 01:31:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:31:57.186901 | orchestrator | 2026-04-07 01:31:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:31:57.187060 | orchestrator | 2026-04-07 01:31:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:00.230798 | orchestrator | 2026-04-07 01:32:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:00.233171 | orchestrator | 2026-04-07 01:32:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:00.233624 | orchestrator | 2026-04-07 01:32:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:03.278970 | orchestrator | 2026-04-07 01:32:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:03.282468 | orchestrator | 2026-04-07 01:32:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:32:03.282583 | orchestrator | 2026-04-07 01:32:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:06.335832 | orchestrator | 2026-04-07 01:32:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:06.338656 | orchestrator | 2026-04-07 01:32:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:06.338735 | orchestrator | 2026-04-07 01:32:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:09.390269 | orchestrator | 2026-04-07 01:32:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:09.393473 | orchestrator | 2026-04-07 01:32:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:09.393564 | orchestrator | 2026-04-07 01:32:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:12.443470 | orchestrator | 2026-04-07 01:32:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:12.445687 | orchestrator | 2026-04-07 01:32:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:12.445757 | orchestrator | 2026-04-07 01:32:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:15.495438 | orchestrator | 2026-04-07 01:32:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:15.496766 | orchestrator | 2026-04-07 01:32:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:15.496842 | orchestrator | 2026-04-07 01:32:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:18.551062 | orchestrator | 2026-04-07 01:32:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:18.554491 | orchestrator | 2026-04-07 01:32:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:18.554609 | orchestrator | 2026-04-07 01:32:18 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:32:21.605687 | orchestrator | 2026-04-07 01:32:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:21.609036 | orchestrator | 2026-04-07 01:32:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:21.609118 | orchestrator | 2026-04-07 01:32:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:24.656848 | orchestrator | 2026-04-07 01:32:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:24.658294 | orchestrator | 2026-04-07 01:32:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:24.658339 | orchestrator | 2026-04-07 01:32:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:27.707345 | orchestrator | 2026-04-07 01:32:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:27.709473 | orchestrator | 2026-04-07 01:32:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:27.709673 | orchestrator | 2026-04-07 01:32:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:30.759844 | orchestrator | 2026-04-07 01:32:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:30.761226 | orchestrator | 2026-04-07 01:32:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:30.761268 | orchestrator | 2026-04-07 01:32:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:33.812202 | orchestrator | 2026-04-07 01:32:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:33.813411 | orchestrator | 2026-04-07 01:32:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:33.813469 | orchestrator | 2026-04-07 01:32:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:36.858273 | orchestrator | 2026-04-07 
01:32:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:36.858933 | orchestrator | 2026-04-07 01:32:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:36.858985 | orchestrator | 2026-04-07 01:32:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:39.904238 | orchestrator | 2026-04-07 01:32:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:39.906734 | orchestrator | 2026-04-07 01:32:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:39.906803 | orchestrator | 2026-04-07 01:32:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:42.958510 | orchestrator | 2026-04-07 01:32:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:42.960292 | orchestrator | 2026-04-07 01:32:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:42.960440 | orchestrator | 2026-04-07 01:32:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:46.014483 | orchestrator | 2026-04-07 01:32:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:46.016748 | orchestrator | 2026-04-07 01:32:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:46.017377 | orchestrator | 2026-04-07 01:32:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:49.065585 | orchestrator | 2026-04-07 01:32:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:49.066778 | orchestrator | 2026-04-07 01:32:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:49.066895 | orchestrator | 2026-04-07 01:32:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:52.110340 | orchestrator | 2026-04-07 01:32:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:32:52.111489 | orchestrator | 2026-04-07 01:32:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:52.111504 | orchestrator | 2026-04-07 01:32:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:55.164974 | orchestrator | 2026-04-07 01:32:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:55.165959 | orchestrator | 2026-04-07 01:32:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:55.165994 | orchestrator | 2026-04-07 01:32:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:32:58.213680 | orchestrator | 2026-04-07 01:32:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:32:58.214866 | orchestrator | 2026-04-07 01:32:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:32:58.215031 | orchestrator | 2026-04-07 01:32:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:01.268744 | orchestrator | 2026-04-07 01:33:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:01.271153 | orchestrator | 2026-04-07 01:33:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:01.271216 | orchestrator | 2026-04-07 01:33:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:04.318494 | orchestrator | 2026-04-07 01:33:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:04.320723 | orchestrator | 2026-04-07 01:33:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:04.320956 | orchestrator | 2026-04-07 01:33:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:07.363649 | orchestrator | 2026-04-07 01:33:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:07.365578 | orchestrator | 2026-04-07 01:33:07 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:07.365658 | orchestrator | 2026-04-07 01:33:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:10.414698 | orchestrator | 2026-04-07 01:33:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:10.415004 | orchestrator | 2026-04-07 01:33:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:10.415030 | orchestrator | 2026-04-07 01:33:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:13.463328 | orchestrator | 2026-04-07 01:33:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:13.463744 | orchestrator | 2026-04-07 01:33:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:13.464026 | orchestrator | 2026-04-07 01:33:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:16.510234 | orchestrator | 2026-04-07 01:33:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:16.512697 | orchestrator | 2026-04-07 01:33:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:16.512861 | orchestrator | 2026-04-07 01:33:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:19.561000 | orchestrator | 2026-04-07 01:33:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:19.563344 | orchestrator | 2026-04-07 01:33:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:19.563425 | orchestrator | 2026-04-07 01:33:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:22.608092 | orchestrator | 2026-04-07 01:33:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:22.610431 | orchestrator | 2026-04-07 01:33:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:33:22.610512 | orchestrator | 2026-04-07 01:33:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:25.655360 | orchestrator | 2026-04-07 01:33:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:25.656640 | orchestrator | 2026-04-07 01:33:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:25.656757 | orchestrator | 2026-04-07 01:33:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:28.698046 | orchestrator | 2026-04-07 01:33:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:28.701183 | orchestrator | 2026-04-07 01:33:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:28.701316 | orchestrator | 2026-04-07 01:33:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:31.753520 | orchestrator | 2026-04-07 01:33:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:33:31.755505 | orchestrator | 2026-04-07 01:33:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:33:31.755581 | orchestrator | 2026-04-07 01:33:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:33:34.801136 | orchestrator | 2026-04-07 01:33:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:34.882919 | orchestrator | 2026-04-07 01:35:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:34.883048 | orchestrator | 2026-04-07 01:35:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:35:37.924936 | orchestrator | 2026-04-07 01:35:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:37.927751 | orchestrator | 2026-04-07 01:35:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:37.927816 | orchestrator | 2026-04-07 01:35:37 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:35:40.978874 | orchestrator | 2026-04-07 01:35:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:40.980255 | orchestrator | 2026-04-07 01:35:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:40.980403 | orchestrator | 2026-04-07 01:35:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:35:44.022451 | orchestrator | 2026-04-07 01:35:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:44.023987 | orchestrator | 2026-04-07 01:35:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:44.024049 | orchestrator | 2026-04-07 01:35:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:35:47.063397 | orchestrator | 2026-04-07 01:35:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:47.064908 | orchestrator | 2026-04-07 01:35:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:47.065036 | orchestrator | 2026-04-07 01:35:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:35:50.107731 | orchestrator | 2026-04-07 01:35:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:50.109437 | orchestrator | 2026-04-07 01:35:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:50.109621 | orchestrator | 2026-04-07 01:35:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:35:53.155886 | orchestrator | 2026-04-07 01:35:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:53.157432 | orchestrator | 2026-04-07 01:35:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:53.157468 | orchestrator | 2026-04-07 01:35:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:35:56.203997 | orchestrator | 2026-04-07 
01:35:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:56.205499 | orchestrator | 2026-04-07 01:35:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:56.205589 | orchestrator | 2026-04-07 01:35:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:35:59.253192 | orchestrator | 2026-04-07 01:35:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:35:59.254713 | orchestrator | 2026-04-07 01:35:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:35:59.254768 | orchestrator | 2026-04-07 01:35:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:02.305556 | orchestrator | 2026-04-07 01:36:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:02.307970 | orchestrator | 2026-04-07 01:36:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:02.308058 | orchestrator | 2026-04-07 01:36:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:05.351381 | orchestrator | 2026-04-07 01:36:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:05.353027 | orchestrator | 2026-04-07 01:36:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:05.353067 | orchestrator | 2026-04-07 01:36:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:08.396935 | orchestrator | 2026-04-07 01:36:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:08.398724 | orchestrator | 2026-04-07 01:36:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:08.398835 | orchestrator | 2026-04-07 01:36:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:11.443427 | orchestrator | 2026-04-07 01:36:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:36:11.445805 | orchestrator | 2026-04-07 01:36:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:11.445982 | orchestrator | 2026-04-07 01:36:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:14.493775 | orchestrator | 2026-04-07 01:36:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:14.496811 | orchestrator | 2026-04-07 01:36:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:14.496893 | orchestrator | 2026-04-07 01:36:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:17.539790 | orchestrator | 2026-04-07 01:36:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:17.542503 | orchestrator | 2026-04-07 01:36:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:17.542685 | orchestrator | 2026-04-07 01:36:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:20.583846 | orchestrator | 2026-04-07 01:36:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:20.584105 | orchestrator | 2026-04-07 01:36:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:20.584140 | orchestrator | 2026-04-07 01:36:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:23.632891 | orchestrator | 2026-04-07 01:36:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:23.634311 | orchestrator | 2026-04-07 01:36:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:23.634373 | orchestrator | 2026-04-07 01:36:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:26.681192 | orchestrator | 2026-04-07 01:36:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:26.682874 | orchestrator | 2026-04-07 01:36:26 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:26.682992 | orchestrator | 2026-04-07 01:36:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:29.731117 | orchestrator | 2026-04-07 01:36:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:29.732179 | orchestrator | 2026-04-07 01:36:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:29.732293 | orchestrator | 2026-04-07 01:36:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:32.771688 | orchestrator | 2026-04-07 01:36:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:32.774114 | orchestrator | 2026-04-07 01:36:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:32.774196 | orchestrator | 2026-04-07 01:36:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:35.814644 | orchestrator | 2026-04-07 01:36:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:35.816766 | orchestrator | 2026-04-07 01:36:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:35.817000 | orchestrator | 2026-04-07 01:36:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:38.857366 | orchestrator | 2026-04-07 01:36:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:38.859532 | orchestrator | 2026-04-07 01:36:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:36:38.859621 | orchestrator | 2026-04-07 01:36:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:36:41.902780 | orchestrator | 2026-04-07 01:36:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:36:41.903026 | orchestrator | 2026-04-07 01:36:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:36:41.903095 | orchestrator | 2026-04-07 01:36:41 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:36:44.951887 | orchestrator | 2026-04-07 01:36:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:36:44.952020 | orchestrator | 2026-04-07 01:36:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:36:44.952046 | orchestrator | 2026-04-07 01:36:44 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:41:40.679673 | orchestrator | 2026-04-07 01:41:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:41:40.683464 | orchestrator | 2026-04-07 01:41:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:41:40.683558 | orchestrator | 2026-04-07 01:41:40 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:41:43.741183 | orchestrator | 2026-04-07 01:41:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:41:43.742908 | orchestrator | 2026-04-07 01:41:43 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:41:43.742986 | orchestrator | 2026-04-07 01:41:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:41:46.789159 | orchestrator | 2026-04-07 01:41:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:41:46.792984 | orchestrator | 2026-04-07 01:41:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:41:46.793083 | orchestrator | 2026-04-07 01:41:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:41:49.843832 | orchestrator | 2026-04-07 01:41:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:41:49.845528 | orchestrator | 2026-04-07 01:41:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:41:49.845609 | orchestrator | 2026-04-07 01:41:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:41:52.912872 | orchestrator | 2026-04-07 01:41:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:41:52.914534 | orchestrator | 2026-04-07 01:41:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:41:52.914629 | orchestrator | 2026-04-07 01:41:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:41:55.969728 | orchestrator | 2026-04-07 01:41:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:41:55.972186 | orchestrator | 2026-04-07 01:41:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:41:55.972251 | orchestrator | 2026-04-07 01:41:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:41:59.014658 | orchestrator | 2026-04-07 01:41:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:41:59.014893 | orchestrator | 2026-04-07 01:41:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:41:59.015134 | orchestrator | 2026-04-07 01:41:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:02.067034 | orchestrator | 2026-04-07 01:42:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:02.068783 | orchestrator | 2026-04-07 01:42:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:02.068865 | orchestrator | 2026-04-07 01:42:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:05.125930 | orchestrator | 2026-04-07 01:42:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:05.126878 | orchestrator | 2026-04-07 01:42:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:05.126903 | orchestrator | 2026-04-07 01:42:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:08.169677 | orchestrator | 2026-04-07 01:42:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:08.171275 | orchestrator | 2026-04-07 01:42:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:08.171321 | orchestrator | 2026-04-07 01:42:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:11.217974 | orchestrator | 2026-04-07 01:42:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:11.218853 | orchestrator | 2026-04-07 01:42:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:11.218959 | orchestrator | 2026-04-07 01:42:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:14.259825 | orchestrator | 2026-04-07 01:42:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:14.261819 | orchestrator | 2026-04-07 01:42:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:14.261883 | orchestrator | 2026-04-07 01:42:14 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:42:17.308066 | orchestrator | 2026-04-07 01:42:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:17.309751 | orchestrator | 2026-04-07 01:42:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:17.309872 | orchestrator | 2026-04-07 01:42:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:20.362097 | orchestrator | 2026-04-07 01:42:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:20.365156 | orchestrator | 2026-04-07 01:42:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:20.365251 | orchestrator | 2026-04-07 01:42:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:23.412814 | orchestrator | 2026-04-07 01:42:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:23.414827 | orchestrator | 2026-04-07 01:42:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:23.414888 | orchestrator | 2026-04-07 01:42:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:26.465826 | orchestrator | 2026-04-07 01:42:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:26.467954 | orchestrator | 2026-04-07 01:42:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:26.467994 | orchestrator | 2026-04-07 01:42:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:29.524368 | orchestrator | 2026-04-07 01:42:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:29.526388 | orchestrator | 2026-04-07 01:42:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:29.526499 | orchestrator | 2026-04-07 01:42:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:32.572157 | orchestrator | 2026-04-07 
01:42:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:32.573033 | orchestrator | 2026-04-07 01:42:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:32.573151 | orchestrator | 2026-04-07 01:42:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:35.623806 | orchestrator | 2026-04-07 01:42:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:35.625242 | orchestrator | 2026-04-07 01:42:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:35.625332 | orchestrator | 2026-04-07 01:42:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:38.670117 | orchestrator | 2026-04-07 01:42:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:38.671743 | orchestrator | 2026-04-07 01:42:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:38.671968 | orchestrator | 2026-04-07 01:42:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:41.715017 | orchestrator | 2026-04-07 01:42:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:41.716138 | orchestrator | 2026-04-07 01:42:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:41.717375 | orchestrator | 2026-04-07 01:42:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:44.756171 | orchestrator | 2026-04-07 01:42:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:44.757040 | orchestrator | 2026-04-07 01:42:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:44.757103 | orchestrator | 2026-04-07 01:42:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:47.801538 | orchestrator | 2026-04-07 01:42:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:42:47.802679 | orchestrator | 2026-04-07 01:42:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:47.802745 | orchestrator | 2026-04-07 01:42:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:50.857132 | orchestrator | 2026-04-07 01:42:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:50.859351 | orchestrator | 2026-04-07 01:42:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:50.859424 | orchestrator | 2026-04-07 01:42:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:53.906759 | orchestrator | 2026-04-07 01:42:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:53.908474 | orchestrator | 2026-04-07 01:42:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:53.908533 | orchestrator | 2026-04-07 01:42:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:42:56.954735 | orchestrator | 2026-04-07 01:42:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:42:56.955445 | orchestrator | 2026-04-07 01:42:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:42:56.955805 | orchestrator | 2026-04-07 01:42:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:00.001250 | orchestrator | 2026-04-07 01:43:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:00.003800 | orchestrator | 2026-04-07 01:43:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:00.003954 | orchestrator | 2026-04-07 01:43:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:03.053200 | orchestrator | 2026-04-07 01:43:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:03.055068 | orchestrator | 2026-04-07 01:43:03 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:03.055119 | orchestrator | 2026-04-07 01:43:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:06.109068 | orchestrator | 2026-04-07 01:43:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:06.110967 | orchestrator | 2026-04-07 01:43:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:06.111155 | orchestrator | 2026-04-07 01:43:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:09.159406 | orchestrator | 2026-04-07 01:43:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:09.160731 | orchestrator | 2026-04-07 01:43:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:09.160910 | orchestrator | 2026-04-07 01:43:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:12.213885 | orchestrator | 2026-04-07 01:43:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:12.215996 | orchestrator | 2026-04-07 01:43:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:12.216110 | orchestrator | 2026-04-07 01:43:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:15.272907 | orchestrator | 2026-04-07 01:43:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:15.275386 | orchestrator | 2026-04-07 01:43:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:15.275564 | orchestrator | 2026-04-07 01:43:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:18.328928 | orchestrator | 2026-04-07 01:43:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:18.330902 | orchestrator | 2026-04-07 01:43:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:43:18.330976 | orchestrator | 2026-04-07 01:43:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:21.384075 | orchestrator | 2026-04-07 01:43:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:21.385800 | orchestrator | 2026-04-07 01:43:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:21.385851 | orchestrator | 2026-04-07 01:43:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:24.438656 | orchestrator | 2026-04-07 01:43:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:24.440686 | orchestrator | 2026-04-07 01:43:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:24.440756 | orchestrator | 2026-04-07 01:43:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:27.489808 | orchestrator | 2026-04-07 01:43:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:27.490742 | orchestrator | 2026-04-07 01:43:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:27.490781 | orchestrator | 2026-04-07 01:43:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:30.534582 | orchestrator | 2026-04-07 01:43:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:30.535681 | orchestrator | 2026-04-07 01:43:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:30.535733 | orchestrator | 2026-04-07 01:43:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:33.590269 | orchestrator | 2026-04-07 01:43:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:33.592503 | orchestrator | 2026-04-07 01:43:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:33.592559 | orchestrator | 2026-04-07 01:43:33 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:43:36.641537 | orchestrator | 2026-04-07 01:43:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:36.642998 | orchestrator | 2026-04-07 01:43:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:36.643080 | orchestrator | 2026-04-07 01:43:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:39.693439 | orchestrator | 2026-04-07 01:43:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:39.694495 | orchestrator | 2026-04-07 01:43:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:39.694555 | orchestrator | 2026-04-07 01:43:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:42.749323 | orchestrator | 2026-04-07 01:43:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:42.751208 | orchestrator | 2026-04-07 01:43:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:42.751272 | orchestrator | 2026-04-07 01:43:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:45.805180 | orchestrator | 2026-04-07 01:43:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:45.806199 | orchestrator | 2026-04-07 01:43:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:45.806256 | orchestrator | 2026-04-07 01:43:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:48.853926 | orchestrator | 2026-04-07 01:43:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:48.854996 | orchestrator | 2026-04-07 01:43:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:48.855064 | orchestrator | 2026-04-07 01:43:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:51.900159 | orchestrator | 2026-04-07 
01:43:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:51.901200 | orchestrator | 2026-04-07 01:43:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:51.901230 | orchestrator | 2026-04-07 01:43:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:54.948050 | orchestrator | 2026-04-07 01:43:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:54.949074 | orchestrator | 2026-04-07 01:43:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:54.949118 | orchestrator | 2026-04-07 01:43:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:43:57.988909 | orchestrator | 2026-04-07 01:43:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:43:57.991217 | orchestrator | 2026-04-07 01:43:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:43:57.991266 | orchestrator | 2026-04-07 01:43:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:01.041202 | orchestrator | 2026-04-07 01:44:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:01.042778 | orchestrator | 2026-04-07 01:44:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:01.042838 | orchestrator | 2026-04-07 01:44:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:04.088018 | orchestrator | 2026-04-07 01:44:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:04.088506 | orchestrator | 2026-04-07 01:44:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:04.088543 | orchestrator | 2026-04-07 01:44:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:07.134193 | orchestrator | 2026-04-07 01:44:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:44:07.135390 | orchestrator | 2026-04-07 01:44:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:07.135451 | orchestrator | 2026-04-07 01:44:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:10.182520 | orchestrator | 2026-04-07 01:44:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:10.185329 | orchestrator | 2026-04-07 01:44:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:10.185499 | orchestrator | 2026-04-07 01:44:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:13.233142 | orchestrator | 2026-04-07 01:44:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:13.234074 | orchestrator | 2026-04-07 01:44:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:13.234107 | orchestrator | 2026-04-07 01:44:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:16.284947 | orchestrator | 2026-04-07 01:44:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:16.285590 | orchestrator | 2026-04-07 01:44:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:16.285645 | orchestrator | 2026-04-07 01:44:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:19.333916 | orchestrator | 2026-04-07 01:44:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:19.334928 | orchestrator | 2026-04-07 01:44:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:19.334983 | orchestrator | 2026-04-07 01:44:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:22.395115 | orchestrator | 2026-04-07 01:44:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:22.396895 | orchestrator | 2026-04-07 01:44:22 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:22.396964 | orchestrator | 2026-04-07 01:44:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:25.446508 | orchestrator | 2026-04-07 01:44:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:25.448735 | orchestrator | 2026-04-07 01:44:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:25.448809 | orchestrator | 2026-04-07 01:44:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:28.501889 | orchestrator | 2026-04-07 01:44:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:28.504114 | orchestrator | 2026-04-07 01:44:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:28.504194 | orchestrator | 2026-04-07 01:44:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:31.550124 | orchestrator | 2026-04-07 01:44:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:31.550254 | orchestrator | 2026-04-07 01:44:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:31.550278 | orchestrator | 2026-04-07 01:44:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:34.597144 | orchestrator | 2026-04-07 01:44:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:34.598485 | orchestrator | 2026-04-07 01:44:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:34.598574 | orchestrator | 2026-04-07 01:44:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:37.648485 | orchestrator | 2026-04-07 01:44:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:37.649809 | orchestrator | 2026-04-07 01:44:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:44:37.649862 | orchestrator | 2026-04-07 01:44:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:40.694216 | orchestrator | 2026-04-07 01:44:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:40.695545 | orchestrator | 2026-04-07 01:44:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:40.695591 | orchestrator | 2026-04-07 01:44:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:43.745219 | orchestrator | 2026-04-07 01:44:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:43.749088 | orchestrator | 2026-04-07 01:44:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:43.749192 | orchestrator | 2026-04-07 01:44:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:46.802933 | orchestrator | 2026-04-07 01:44:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:46.804689 | orchestrator | 2026-04-07 01:44:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:46.804814 | orchestrator | 2026-04-07 01:44:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:49.850912 | orchestrator | 2026-04-07 01:44:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:49.852001 | orchestrator | 2026-04-07 01:44:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:49.852072 | orchestrator | 2026-04-07 01:44:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:52.903099 | orchestrator | 2026-04-07 01:44:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:52.903587 | orchestrator | 2026-04-07 01:44:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:52.903676 | orchestrator | 2026-04-07 01:44:52 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:44:55.954282 | orchestrator | 2026-04-07 01:44:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:55.955512 | orchestrator | 2026-04-07 01:44:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:55.955779 | orchestrator | 2026-04-07 01:44:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:44:59.006687 | orchestrator | 2026-04-07 01:44:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:44:59.009207 | orchestrator | 2026-04-07 01:44:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:44:59.009276 | orchestrator | 2026-04-07 01:44:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:02.052693 | orchestrator | 2026-04-07 01:45:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:45:02.054110 | orchestrator | 2026-04-07 01:45:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:02.054167 | orchestrator | 2026-04-07 01:45:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:05.105959 | orchestrator | 2026-04-07 01:45:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:45:05.107226 | orchestrator | 2026-04-07 01:45:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:05.107296 | orchestrator | 2026-04-07 01:45:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:08.150253 | orchestrator | 2026-04-07 01:45:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:45:08.151230 | orchestrator | 2026-04-07 01:45:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:08.151325 | orchestrator | 2026-04-07 01:45:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:11.203144 | orchestrator | 2026-04-07 
01:45:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:45:11.204759 | orchestrator | 2026-04-07 01:45:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:11.204967 | orchestrator | 2026-04-07 01:45:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:14.250795 | orchestrator | 2026-04-07 01:45:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:45:14.251969 | orchestrator | 2026-04-07 01:45:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:14.252111 | orchestrator | 2026-04-07 01:45:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:17.298521 | orchestrator | 2026-04-07 01:45:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:45:17.299928 | orchestrator | 2026-04-07 01:45:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:17.299971 | orchestrator | 2026-04-07 01:45:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:20.341379 | orchestrator | 2026-04-07 01:45:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:45:20.342464 | orchestrator | 2026-04-07 01:45:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:20.342517 | orchestrator | 2026-04-07 01:45:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:23.388975 | orchestrator | 2026-04-07 01:45:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:45:23.390642 | orchestrator | 2026-04-07 01:45:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:23.390698 | orchestrator | 2026-04-07 01:45:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:26.445885 | orchestrator | 2026-04-07 01:45:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:45:26.448804 | orchestrator | 2026-04-07 01:45:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:45:26.448864 | orchestrator | 2026-04-07 01:45:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:45:29.495599 | orchestrator | 2026-04-07 01:45:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:50:59.175414 | orchestrator | 2026-04-07 01:50:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:50:59.176547 | orchestrator | 2026-04-07 01:50:59 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:50:59.176597 | orchestrator | 2026-04-07 01:50:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:02.232429 | orchestrator | 2026-04-07 01:51:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:02.234621 | orchestrator | 2026-04-07 01:51:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:02.234990 | orchestrator | 2026-04-07 01:51:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:05.280803 | orchestrator | 2026-04-07 01:51:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:05.284272 | orchestrator | 2026-04-07 01:51:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:05.284361 | orchestrator | 2026-04-07 01:51:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:08.337175 | orchestrator | 2026-04-07 01:51:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:08.339489 | orchestrator | 2026-04-07 01:51:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:08.339577 | orchestrator | 2026-04-07 01:51:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:11.385769 | orchestrator | 2026-04-07 01:51:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:11.387837 | orchestrator | 2026-04-07 01:51:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:11.387964 | orchestrator | 2026-04-07 01:51:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:14.440722 | orchestrator | 2026-04-07 01:51:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:14.441770 | orchestrator | 2026-04-07 01:51:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:51:14.441808 | orchestrator | 2026-04-07 01:51:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:17.485267 | orchestrator | 2026-04-07 01:51:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:17.487125 | orchestrator | 2026-04-07 01:51:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:17.487224 | orchestrator | 2026-04-07 01:51:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:20.533553 | orchestrator | 2026-04-07 01:51:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:20.535013 | orchestrator | 2026-04-07 01:51:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:20.535107 | orchestrator | 2026-04-07 01:51:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:23.586778 | orchestrator | 2026-04-07 01:51:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:23.589479 | orchestrator | 2026-04-07 01:51:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:23.589519 | orchestrator | 2026-04-07 01:51:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:26.642885 | orchestrator | 2026-04-07 01:51:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:26.644152 | orchestrator | 2026-04-07 01:51:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:26.644220 | orchestrator | 2026-04-07 01:51:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:29.693911 | orchestrator | 2026-04-07 01:51:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:29.695439 | orchestrator | 2026-04-07 01:51:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:29.695488 | orchestrator | 2026-04-07 01:51:29 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:51:32.742574 | orchestrator | 2026-04-07 01:51:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:32.743579 | orchestrator | 2026-04-07 01:51:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:32.743622 | orchestrator | 2026-04-07 01:51:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:35.795808 | orchestrator | 2026-04-07 01:51:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:35.796911 | orchestrator | 2026-04-07 01:51:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:35.796971 | orchestrator | 2026-04-07 01:51:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:38.848484 | orchestrator | 2026-04-07 01:51:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:38.850243 | orchestrator | 2026-04-07 01:51:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:38.850340 | orchestrator | 2026-04-07 01:51:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:41.899601 | orchestrator | 2026-04-07 01:51:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:41.900454 | orchestrator | 2026-04-07 01:51:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:41.900492 | orchestrator | 2026-04-07 01:51:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:44.951783 | orchestrator | 2026-04-07 01:51:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:44.952958 | orchestrator | 2026-04-07 01:51:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:44.952999 | orchestrator | 2026-04-07 01:51:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:48.002339 | orchestrator | 2026-04-07 
01:51:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:48.005052 | orchestrator | 2026-04-07 01:51:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:48.005133 | orchestrator | 2026-04-07 01:51:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:51.057666 | orchestrator | 2026-04-07 01:51:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:51.059556 | orchestrator | 2026-04-07 01:51:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:51.059790 | orchestrator | 2026-04-07 01:51:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:54.113934 | orchestrator | 2026-04-07 01:51:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:54.115928 | orchestrator | 2026-04-07 01:51:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:54.115982 | orchestrator | 2026-04-07 01:51:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:51:57.167516 | orchestrator | 2026-04-07 01:51:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:51:57.169381 | orchestrator | 2026-04-07 01:51:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:51:57.169494 | orchestrator | 2026-04-07 01:51:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:00.215354 | orchestrator | 2026-04-07 01:52:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:00.218117 | orchestrator | 2026-04-07 01:52:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:00.218191 | orchestrator | 2026-04-07 01:52:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:03.260998 | orchestrator | 2026-04-07 01:52:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:52:03.261738 | orchestrator | 2026-04-07 01:52:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:03.261808 | orchestrator | 2026-04-07 01:52:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:06.310703 | orchestrator | 2026-04-07 01:52:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:06.314676 | orchestrator | 2026-04-07 01:52:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:06.314743 | orchestrator | 2026-04-07 01:52:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:09.374323 | orchestrator | 2026-04-07 01:52:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:09.378608 | orchestrator | 2026-04-07 01:52:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:09.378733 | orchestrator | 2026-04-07 01:52:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:12.432187 | orchestrator | 2026-04-07 01:52:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:12.438097 | orchestrator | 2026-04-07 01:52:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:12.438176 | orchestrator | 2026-04-07 01:52:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:15.495164 | orchestrator | 2026-04-07 01:52:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:15.496919 | orchestrator | 2026-04-07 01:52:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:15.496989 | orchestrator | 2026-04-07 01:52:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:18.548160 | orchestrator | 2026-04-07 01:52:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:18.552320 | orchestrator | 2026-04-07 01:52:18 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:18.552401 | orchestrator | 2026-04-07 01:52:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:21.603401 | orchestrator | 2026-04-07 01:52:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:21.604388 | orchestrator | 2026-04-07 01:52:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:21.604412 | orchestrator | 2026-04-07 01:52:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:24.656502 | orchestrator | 2026-04-07 01:52:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:24.658279 | orchestrator | 2026-04-07 01:52:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:24.658387 | orchestrator | 2026-04-07 01:52:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:27.712675 | orchestrator | 2026-04-07 01:52:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:27.715127 | orchestrator | 2026-04-07 01:52:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:27.715308 | orchestrator | 2026-04-07 01:52:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:30.764708 | orchestrator | 2026-04-07 01:52:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:30.764889 | orchestrator | 2026-04-07 01:52:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:30.764910 | orchestrator | 2026-04-07 01:52:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:33.814234 | orchestrator | 2026-04-07 01:52:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:33.815894 | orchestrator | 2026-04-07 01:52:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:52:33.816011 | orchestrator | 2026-04-07 01:52:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:36.866192 | orchestrator | 2026-04-07 01:52:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:36.868144 | orchestrator | 2026-04-07 01:52:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:36.868187 | orchestrator | 2026-04-07 01:52:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:39.916483 | orchestrator | 2026-04-07 01:52:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:39.918522 | orchestrator | 2026-04-07 01:52:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:39.918681 | orchestrator | 2026-04-07 01:52:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:42.967921 | orchestrator | 2026-04-07 01:52:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:42.968704 | orchestrator | 2026-04-07 01:52:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:42.968856 | orchestrator | 2026-04-07 01:52:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:46.024999 | orchestrator | 2026-04-07 01:52:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:46.026078 | orchestrator | 2026-04-07 01:52:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:46.026264 | orchestrator | 2026-04-07 01:52:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:49.073934 | orchestrator | 2026-04-07 01:52:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:49.075917 | orchestrator | 2026-04-07 01:52:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:49.075993 | orchestrator | 2026-04-07 01:52:49 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:52:52.126802 | orchestrator | 2026-04-07 01:52:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:52.129678 | orchestrator | 2026-04-07 01:52:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:52.129734 | orchestrator | 2026-04-07 01:52:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:55.177793 | orchestrator | 2026-04-07 01:52:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:55.179261 | orchestrator | 2026-04-07 01:52:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:55.179291 | orchestrator | 2026-04-07 01:52:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:52:58.232065 | orchestrator | 2026-04-07 01:52:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:52:58.234224 | orchestrator | 2026-04-07 01:52:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:52:58.234651 | orchestrator | 2026-04-07 01:52:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:01.286121 | orchestrator | 2026-04-07 01:53:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:01.287723 | orchestrator | 2026-04-07 01:53:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:01.287786 | orchestrator | 2026-04-07 01:53:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:04.336540 | orchestrator | 2026-04-07 01:53:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:04.338806 | orchestrator | 2026-04-07 01:53:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:04.338857 | orchestrator | 2026-04-07 01:53:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:07.388784 | orchestrator | 2026-04-07 
01:53:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:07.390737 | orchestrator | 2026-04-07 01:53:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:07.390850 | orchestrator | 2026-04-07 01:53:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:10.442494 | orchestrator | 2026-04-07 01:53:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:10.443334 | orchestrator | 2026-04-07 01:53:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:10.443374 | orchestrator | 2026-04-07 01:53:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:13.495295 | orchestrator | 2026-04-07 01:53:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:13.497168 | orchestrator | 2026-04-07 01:53:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:13.497416 | orchestrator | 2026-04-07 01:53:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:16.546450 | orchestrator | 2026-04-07 01:53:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:16.547425 | orchestrator | 2026-04-07 01:53:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:16.547472 | orchestrator | 2026-04-07 01:53:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:19.593994 | orchestrator | 2026-04-07 01:53:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:19.594340 | orchestrator | 2026-04-07 01:53:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:19.594769 | orchestrator | 2026-04-07 01:53:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:22.643809 | orchestrator | 2026-04-07 01:53:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:53:22.646503 | orchestrator | 2026-04-07 01:53:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:22.646581 | orchestrator | 2026-04-07 01:53:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:25.690121 | orchestrator | 2026-04-07 01:53:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:25.691359 | orchestrator | 2026-04-07 01:53:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:25.691420 | orchestrator | 2026-04-07 01:53:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:28.742944 | orchestrator | 2026-04-07 01:53:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:28.745572 | orchestrator | 2026-04-07 01:53:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:28.745690 | orchestrator | 2026-04-07 01:53:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:31.786702 | orchestrator | 2026-04-07 01:53:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:31.787773 | orchestrator | 2026-04-07 01:53:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:31.787827 | orchestrator | 2026-04-07 01:53:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:34.832755 | orchestrator | 2026-04-07 01:53:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:34.834296 | orchestrator | 2026-04-07 01:53:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:34.834357 | orchestrator | 2026-04-07 01:53:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:37.883141 | orchestrator | 2026-04-07 01:53:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:37.884080 | orchestrator | 2026-04-07 01:53:37 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:37.884115 | orchestrator | 2026-04-07 01:53:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:40.934849 | orchestrator | 2026-04-07 01:53:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:40.936640 | orchestrator | 2026-04-07 01:53:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:40.936707 | orchestrator | 2026-04-07 01:53:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:43.993134 | orchestrator | 2026-04-07 01:53:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:43.996478 | orchestrator | 2026-04-07 01:53:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:43.996569 | orchestrator | 2026-04-07 01:53:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:47.042782 | orchestrator | 2026-04-07 01:53:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:47.045414 | orchestrator | 2026-04-07 01:53:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:47.045471 | orchestrator | 2026-04-07 01:53:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:50.093444 | orchestrator | 2026-04-07 01:53:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:50.095463 | orchestrator | 2026-04-07 01:53:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:50.095586 | orchestrator | 2026-04-07 01:53:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:53.142858 | orchestrator | 2026-04-07 01:53:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:53.144569 | orchestrator | 2026-04-07 01:53:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
01:53:53.144703 | orchestrator | 2026-04-07 01:53:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:56.191342 | orchestrator | 2026-04-07 01:53:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:56.193635 | orchestrator | 2026-04-07 01:53:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:56.193750 | orchestrator | 2026-04-07 01:53:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:53:59.236756 | orchestrator | 2026-04-07 01:53:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:53:59.237287 | orchestrator | 2026-04-07 01:53:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:53:59.237323 | orchestrator | 2026-04-07 01:53:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:02.293642 | orchestrator | 2026-04-07 01:54:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:02.295142 | orchestrator | 2026-04-07 01:54:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:02.295358 | orchestrator | 2026-04-07 01:54:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:05.343980 | orchestrator | 2026-04-07 01:54:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:05.345237 | orchestrator | 2026-04-07 01:54:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:05.345292 | orchestrator | 2026-04-07 01:54:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:08.395820 | orchestrator | 2026-04-07 01:54:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:08.397268 | orchestrator | 2026-04-07 01:54:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:08.397350 | orchestrator | 2026-04-07 01:54:08 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 01:54:11.445829 | orchestrator | 2026-04-07 01:54:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:11.447231 | orchestrator | 2026-04-07 01:54:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:11.447299 | orchestrator | 2026-04-07 01:54:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:14.499032 | orchestrator | 2026-04-07 01:54:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:14.502283 | orchestrator | 2026-04-07 01:54:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:14.502356 | orchestrator | 2026-04-07 01:54:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:17.548243 | orchestrator | 2026-04-07 01:54:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:17.549805 | orchestrator | 2026-04-07 01:54:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:17.549860 | orchestrator | 2026-04-07 01:54:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:20.598449 | orchestrator | 2026-04-07 01:54:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:20.600448 | orchestrator | 2026-04-07 01:54:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:20.600498 | orchestrator | 2026-04-07 01:54:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:23.650775 | orchestrator | 2026-04-07 01:54:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:23.652789 | orchestrator | 2026-04-07 01:54:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:23.652863 | orchestrator | 2026-04-07 01:54:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:26.692097 | orchestrator | 2026-04-07 
01:54:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:26.693252 | orchestrator | 2026-04-07 01:54:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:26.693303 | orchestrator | 2026-04-07 01:54:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:29.742270 | orchestrator | 2026-04-07 01:54:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:29.743264 | orchestrator | 2026-04-07 01:54:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:29.743323 | orchestrator | 2026-04-07 01:54:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:32.792300 | orchestrator | 2026-04-07 01:54:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:32.795970 | orchestrator | 2026-04-07 01:54:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:32.796047 | orchestrator | 2026-04-07 01:54:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:35.842688 | orchestrator | 2026-04-07 01:54:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:35.845524 | orchestrator | 2026-04-07 01:54:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:35.845683 | orchestrator | 2026-04-07 01:54:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:38.894932 | orchestrator | 2026-04-07 01:54:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 01:54:38.896404 | orchestrator | 2026-04-07 01:54:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:54:38.896506 | orchestrator | 2026-04-07 01:54:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 01:54:41.944474 | orchestrator | 2026-04-07 01:54:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 01:54:41.945137 | orchestrator | 2026-04-07 01:54:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:54:41.945181 | orchestrator | 2026-04-07 01:54:41 | INFO  | Wait 1 second(s) until the next check
2026-04-07 01:54:44.995796 | orchestrator | 2026-04-07 01:54:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 01:54:44.996921 | orchestrator | 2026-04-07 01:54:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 01:54:44.997008 | orchestrator | 2026-04-07 01:54:44 | INFO  | Wait 1 second(s) until the next check
[... identical polling of tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 repeated every ~3 seconds from 01:54:48 through 01:59:56; both tasks remain in state STARTED throughout ...]
2026-04-07 01:59:59.286219 | orchestrator | 2026-04-07 01:59:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state
STARTED 2026-04-07 01:59:59.287032 | orchestrator | 2026-04-07 01:59:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 01:59:59.287116 | orchestrator | 2026-04-07 01:59:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:02.330602 | orchestrator | 2026-04-07 02:00:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:02.331613 | orchestrator | 2026-04-07 02:00:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:02.331694 | orchestrator | 2026-04-07 02:00:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:05.371939 | orchestrator | 2026-04-07 02:00:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:05.373251 | orchestrator | 2026-04-07 02:00:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:05.373342 | orchestrator | 2026-04-07 02:00:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:08.423144 | orchestrator | 2026-04-07 02:00:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:08.425143 | orchestrator | 2026-04-07 02:00:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:08.425197 | orchestrator | 2026-04-07 02:00:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:11.472458 | orchestrator | 2026-04-07 02:00:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:11.473853 | orchestrator | 2026-04-07 02:00:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:11.474072 | orchestrator | 2026-04-07 02:00:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:14.515791 | orchestrator | 2026-04-07 02:00:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:14.517064 | orchestrator | 2026-04-07 02:00:14 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:14.517109 | orchestrator | 2026-04-07 02:00:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:17.553884 | orchestrator | 2026-04-07 02:00:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:17.555594 | orchestrator | 2026-04-07 02:00:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:17.555627 | orchestrator | 2026-04-07 02:00:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:20.599828 | orchestrator | 2026-04-07 02:00:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:20.601501 | orchestrator | 2026-04-07 02:00:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:20.601543 | orchestrator | 2026-04-07 02:00:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:23.648469 | orchestrator | 2026-04-07 02:00:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:23.651160 | orchestrator | 2026-04-07 02:00:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:23.651230 | orchestrator | 2026-04-07 02:00:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:26.692711 | orchestrator | 2026-04-07 02:00:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:26.693887 | orchestrator | 2026-04-07 02:00:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:26.693978 | orchestrator | 2026-04-07 02:00:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:29.740003 | orchestrator | 2026-04-07 02:00:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:29.741547 | orchestrator | 2026-04-07 02:00:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:00:29.741789 | orchestrator | 2026-04-07 02:00:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:32.787627 | orchestrator | 2026-04-07 02:00:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:32.789262 | orchestrator | 2026-04-07 02:00:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:32.789428 | orchestrator | 2026-04-07 02:00:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:35.832096 | orchestrator | 2026-04-07 02:00:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:35.834409 | orchestrator | 2026-04-07 02:00:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:35.834499 | orchestrator | 2026-04-07 02:00:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:38.876071 | orchestrator | 2026-04-07 02:00:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:38.878117 | orchestrator | 2026-04-07 02:00:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:38.878170 | orchestrator | 2026-04-07 02:00:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:41.915395 | orchestrator | 2026-04-07 02:00:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:41.916635 | orchestrator | 2026-04-07 02:00:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:41.916817 | orchestrator | 2026-04-07 02:00:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:44.960905 | orchestrator | 2026-04-07 02:00:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:44.962261 | orchestrator | 2026-04-07 02:00:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:44.962388 | orchestrator | 2026-04-07 02:00:44 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:00:48.009207 | orchestrator | 2026-04-07 02:00:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:48.012927 | orchestrator | 2026-04-07 02:00:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:48.012979 | orchestrator | 2026-04-07 02:00:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:51.055551 | orchestrator | 2026-04-07 02:00:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:51.056777 | orchestrator | 2026-04-07 02:00:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:51.056826 | orchestrator | 2026-04-07 02:00:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:54.106074 | orchestrator | 2026-04-07 02:00:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:54.107491 | orchestrator | 2026-04-07 02:00:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:54.107543 | orchestrator | 2026-04-07 02:00:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:00:57.153448 | orchestrator | 2026-04-07 02:00:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:00:57.154673 | orchestrator | 2026-04-07 02:00:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:00:57.154764 | orchestrator | 2026-04-07 02:00:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:00.204651 | orchestrator | 2026-04-07 02:01:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:00.206824 | orchestrator | 2026-04-07 02:01:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:00.206919 | orchestrator | 2026-04-07 02:01:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:03.258597 | orchestrator | 2026-04-07 
02:01:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:03.259675 | orchestrator | 2026-04-07 02:01:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:03.259732 | orchestrator | 2026-04-07 02:01:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:06.306434 | orchestrator | 2026-04-07 02:01:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:06.308113 | orchestrator | 2026-04-07 02:01:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:06.308183 | orchestrator | 2026-04-07 02:01:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:09.353572 | orchestrator | 2026-04-07 02:01:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:09.355908 | orchestrator | 2026-04-07 02:01:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:09.356050 | orchestrator | 2026-04-07 02:01:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:12.401162 | orchestrator | 2026-04-07 02:01:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:12.403289 | orchestrator | 2026-04-07 02:01:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:12.403357 | orchestrator | 2026-04-07 02:01:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:15.448166 | orchestrator | 2026-04-07 02:01:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:15.448906 | orchestrator | 2026-04-07 02:01:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:15.448969 | orchestrator | 2026-04-07 02:01:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:18.491974 | orchestrator | 2026-04-07 02:01:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:01:18.493206 | orchestrator | 2026-04-07 02:01:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:18.493270 | orchestrator | 2026-04-07 02:01:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:21.545111 | orchestrator | 2026-04-07 02:01:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:21.546162 | orchestrator | 2026-04-07 02:01:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:21.546208 | orchestrator | 2026-04-07 02:01:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:24.592697 | orchestrator | 2026-04-07 02:01:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:24.593868 | orchestrator | 2026-04-07 02:01:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:24.593911 | orchestrator | 2026-04-07 02:01:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:27.643065 | orchestrator | 2026-04-07 02:01:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:27.645907 | orchestrator | 2026-04-07 02:01:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:27.645975 | orchestrator | 2026-04-07 02:01:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:30.689083 | orchestrator | 2026-04-07 02:01:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:30.689983 | orchestrator | 2026-04-07 02:01:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:30.690049 | orchestrator | 2026-04-07 02:01:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:33.739419 | orchestrator | 2026-04-07 02:01:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:33.741139 | orchestrator | 2026-04-07 02:01:33 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:33.741199 | orchestrator | 2026-04-07 02:01:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:36.794390 | orchestrator | 2026-04-07 02:01:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:36.795234 | orchestrator | 2026-04-07 02:01:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:36.795323 | orchestrator | 2026-04-07 02:01:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:39.844735 | orchestrator | 2026-04-07 02:01:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:39.845721 | orchestrator | 2026-04-07 02:01:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:39.845783 | orchestrator | 2026-04-07 02:01:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:42.895966 | orchestrator | 2026-04-07 02:01:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:42.897942 | orchestrator | 2026-04-07 02:01:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:42.898003 | orchestrator | 2026-04-07 02:01:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:45.947730 | orchestrator | 2026-04-07 02:01:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:45.949952 | orchestrator | 2026-04-07 02:01:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:45.950095 | orchestrator | 2026-04-07 02:01:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:48.999830 | orchestrator | 2026-04-07 02:01:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:49.002101 | orchestrator | 2026-04-07 02:01:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:01:49.002166 | orchestrator | 2026-04-07 02:01:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:52.049032 | orchestrator | 2026-04-07 02:01:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:52.050670 | orchestrator | 2026-04-07 02:01:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:52.051055 | orchestrator | 2026-04-07 02:01:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:55.096643 | orchestrator | 2026-04-07 02:01:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:55.097455 | orchestrator | 2026-04-07 02:01:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:55.097529 | orchestrator | 2026-04-07 02:01:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:01:58.145069 | orchestrator | 2026-04-07 02:01:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:01:58.146753 | orchestrator | 2026-04-07 02:01:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:01:58.146896 | orchestrator | 2026-04-07 02:01:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:01.184835 | orchestrator | 2026-04-07 02:02:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:01.185286 | orchestrator | 2026-04-07 02:02:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:01.185389 | orchestrator | 2026-04-07 02:02:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:04.232023 | orchestrator | 2026-04-07 02:02:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:04.232955 | orchestrator | 2026-04-07 02:02:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:04.233082 | orchestrator | 2026-04-07 02:02:04 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:02:07.286531 | orchestrator | 2026-04-07 02:02:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:07.288129 | orchestrator | 2026-04-07 02:02:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:07.288183 | orchestrator | 2026-04-07 02:02:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:10.336931 | orchestrator | 2026-04-07 02:02:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:10.339514 | orchestrator | 2026-04-07 02:02:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:10.339581 | orchestrator | 2026-04-07 02:02:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:13.386911 | orchestrator | 2026-04-07 02:02:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:13.387986 | orchestrator | 2026-04-07 02:02:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:13.388018 | orchestrator | 2026-04-07 02:02:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:16.430324 | orchestrator | 2026-04-07 02:02:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:16.431871 | orchestrator | 2026-04-07 02:02:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:16.431926 | orchestrator | 2026-04-07 02:02:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:19.480193 | orchestrator | 2026-04-07 02:02:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:19.482007 | orchestrator | 2026-04-07 02:02:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:19.482128 | orchestrator | 2026-04-07 02:02:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:22.531042 | orchestrator | 2026-04-07 
02:02:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:22.532818 | orchestrator | 2026-04-07 02:02:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:22.532996 | orchestrator | 2026-04-07 02:02:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:25.577171 | orchestrator | 2026-04-07 02:02:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:25.578145 | orchestrator | 2026-04-07 02:02:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:25.578187 | orchestrator | 2026-04-07 02:02:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:28.626563 | orchestrator | 2026-04-07 02:02:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:28.628102 | orchestrator | 2026-04-07 02:02:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:28.628147 | orchestrator | 2026-04-07 02:02:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:31.676738 | orchestrator | 2026-04-07 02:02:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:31.679034 | orchestrator | 2026-04-07 02:02:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:31.679082 | orchestrator | 2026-04-07 02:02:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:34.728026 | orchestrator | 2026-04-07 02:02:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:34.729390 | orchestrator | 2026-04-07 02:02:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:34.729478 | orchestrator | 2026-04-07 02:02:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:37.768870 | orchestrator | 2026-04-07 02:02:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:02:37.769522 | orchestrator | 2026-04-07 02:02:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:37.769620 | orchestrator | 2026-04-07 02:02:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:40.815940 | orchestrator | 2026-04-07 02:02:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:40.817450 | orchestrator | 2026-04-07 02:02:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:40.817656 | orchestrator | 2026-04-07 02:02:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:43.857598 | orchestrator | 2026-04-07 02:02:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:43.860056 | orchestrator | 2026-04-07 02:02:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:43.860109 | orchestrator | 2026-04-07 02:02:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:46.904133 | orchestrator | 2026-04-07 02:02:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:46.905110 | orchestrator | 2026-04-07 02:02:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:46.905157 | orchestrator | 2026-04-07 02:02:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:49.950475 | orchestrator | 2026-04-07 02:02:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:49.950927 | orchestrator | 2026-04-07 02:02:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:49.950965 | orchestrator | 2026-04-07 02:02:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:52.999179 | orchestrator | 2026-04-07 02:02:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:52.999925 | orchestrator | 2026-04-07 02:02:52 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:52.999961 | orchestrator | 2026-04-07 02:02:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:56.047170 | orchestrator | 2026-04-07 02:02:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:56.047278 | orchestrator | 2026-04-07 02:02:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:56.048061 | orchestrator | 2026-04-07 02:02:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:02:59.098668 | orchestrator | 2026-04-07 02:02:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:02:59.100467 | orchestrator | 2026-04-07 02:02:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:02:59.100511 | orchestrator | 2026-04-07 02:02:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:03:02.145210 | orchestrator | 2026-04-07 02:03:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:02.146316 | orchestrator | 2026-04-07 02:03:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:03:02.146617 | orchestrator | 2026-04-07 02:03:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:03:05.192343 | orchestrator | 2026-04-07 02:03:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:05.193871 | orchestrator | 2026-04-07 02:03:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:03:05.193917 | orchestrator | 2026-04-07 02:03:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:03:08.237740 | orchestrator | 2026-04-07 02:03:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:08.238933 | orchestrator | 2026-04-07 02:03:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:03:08.239009 | orchestrator | 2026-04-07 02:03:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:03:11.286587 | orchestrator | 2026-04-07 02:03:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:11.287118 | orchestrator | 2026-04-07 02:03:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:03:11.287164 | orchestrator | 2026-04-07 02:03:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:03:14.340219 | orchestrator | 2026-04-07 02:03:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:14.341607 | orchestrator | 2026-04-07 02:03:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:03:14.341650 | orchestrator | 2026-04-07 02:03:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:03:17.393975 | orchestrator | 2026-04-07 02:03:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:17.395396 | orchestrator | 2026-04-07 02:03:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:03:17.395461 | orchestrator | 2026-04-07 02:03:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:03:20.450475 | orchestrator | 2026-04-07 02:03:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:20.451825 | orchestrator | 2026-04-07 02:03:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:03:20.451903 | orchestrator | 2026-04-07 02:03:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:03:23.503961 | orchestrator | 2026-04-07 02:03:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:23.505494 | orchestrator | 2026-04-07 02:03:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:03:23.505526 | orchestrator | 2026-04-07 02:03:23 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:03:26.552829 | orchestrator | 2026-04-07 02:03:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:03:26.555091 | orchestrator | 2026-04-07 02:03:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:03:26.555152 | orchestrator | 2026-04-07 02:03:26 | INFO  | Wait 1 second(s) until the next check
[... identical polling cycles repeated roughly every 3 seconds from 02:03:29 through 02:10:37; tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remain in state STARTED throughout; log gap between 02:04:36 and 02:06:39 ...]
2026-04-07 02:10:40.636875 | orchestrator | 2026-04-07 02:10:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:10:40.639663 | orchestrator | 2026-04-07 02:10:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:10:40.639753 | orchestrator | 2026-04-07 02:10:40 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:10:43.684916 | orchestrator | 2026-04-07 02:10:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:10:43.686475 | orchestrator | 2026-04-07 02:10:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:10:43.686531 | orchestrator | 2026-04-07 02:10:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:10:46.736514 | orchestrator | 2026-04-07 02:10:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:10:46.738178 | orchestrator | 2026-04-07 02:10:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:10:46.738246 | orchestrator | 2026-04-07 02:10:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:10:49.784038 | orchestrator | 2026-04-07 02:10:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:10:49.784829 | orchestrator | 2026-04-07 02:10:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:10:49.784930 | orchestrator | 2026-04-07 02:10:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:10:52.829921 | orchestrator | 2026-04-07 02:10:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:10:52.831680 | orchestrator | 2026-04-07 02:10:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:10:52.831768 | orchestrator | 2026-04-07 02:10:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:10:55.878660 | orchestrator | 2026-04-07 02:10:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:10:55.879978 | orchestrator | 2026-04-07 02:10:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:10:55.880033 | orchestrator | 2026-04-07 02:10:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:10:58.926628 | orchestrator | 2026-04-07 
02:10:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:10:58.927922 | orchestrator | 2026-04-07 02:10:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:10:58.927965 | orchestrator | 2026-04-07 02:10:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:01.976877 | orchestrator | 2026-04-07 02:11:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:01.978219 | orchestrator | 2026-04-07 02:11:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:01.978398 | orchestrator | 2026-04-07 02:11:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:05.032994 | orchestrator | 2026-04-07 02:11:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:05.033589 | orchestrator | 2026-04-07 02:11:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:05.033626 | orchestrator | 2026-04-07 02:11:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:08.089750 | orchestrator | 2026-04-07 02:11:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:08.092281 | orchestrator | 2026-04-07 02:11:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:08.092342 | orchestrator | 2026-04-07 02:11:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:11.147206 | orchestrator | 2026-04-07 02:11:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:11.149227 | orchestrator | 2026-04-07 02:11:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:11.149293 | orchestrator | 2026-04-07 02:11:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:14.194471 | orchestrator | 2026-04-07 02:11:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:11:14.196020 | orchestrator | 2026-04-07 02:11:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:14.196123 | orchestrator | 2026-04-07 02:11:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:17.247681 | orchestrator | 2026-04-07 02:11:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:17.250265 | orchestrator | 2026-04-07 02:11:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:17.250330 | orchestrator | 2026-04-07 02:11:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:20.298883 | orchestrator | 2026-04-07 02:11:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:20.300728 | orchestrator | 2026-04-07 02:11:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:20.300776 | orchestrator | 2026-04-07 02:11:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:23.345402 | orchestrator | 2026-04-07 02:11:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:23.347618 | orchestrator | 2026-04-07 02:11:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:23.347691 | orchestrator | 2026-04-07 02:11:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:26.392125 | orchestrator | 2026-04-07 02:11:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:26.394346 | orchestrator | 2026-04-07 02:11:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:26.394389 | orchestrator | 2026-04-07 02:11:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:29.437921 | orchestrator | 2026-04-07 02:11:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:29.440029 | orchestrator | 2026-04-07 02:11:29 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:29.440189 | orchestrator | 2026-04-07 02:11:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:32.481955 | orchestrator | 2026-04-07 02:11:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:32.483110 | orchestrator | 2026-04-07 02:11:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:32.483180 | orchestrator | 2026-04-07 02:11:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:35.534445 | orchestrator | 2026-04-07 02:11:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:35.535657 | orchestrator | 2026-04-07 02:11:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:35.535881 | orchestrator | 2026-04-07 02:11:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:38.580709 | orchestrator | 2026-04-07 02:11:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:38.583175 | orchestrator | 2026-04-07 02:11:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:38.583250 | orchestrator | 2026-04-07 02:11:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:41.630625 | orchestrator | 2026-04-07 02:11:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:41.635740 | orchestrator | 2026-04-07 02:11:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:41.635823 | orchestrator | 2026-04-07 02:11:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:44.686630 | orchestrator | 2026-04-07 02:11:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:44.688353 | orchestrator | 2026-04-07 02:11:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:11:44.688469 | orchestrator | 2026-04-07 02:11:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:47.728603 | orchestrator | 2026-04-07 02:11:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:47.730887 | orchestrator | 2026-04-07 02:11:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:47.730982 | orchestrator | 2026-04-07 02:11:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:50.772015 | orchestrator | 2026-04-07 02:11:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:50.773737 | orchestrator | 2026-04-07 02:11:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:50.773796 | orchestrator | 2026-04-07 02:11:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:53.812513 | orchestrator | 2026-04-07 02:11:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:53.813285 | orchestrator | 2026-04-07 02:11:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:53.813321 | orchestrator | 2026-04-07 02:11:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:56.861438 | orchestrator | 2026-04-07 02:11:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:56.862239 | orchestrator | 2026-04-07 02:11:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:56.862279 | orchestrator | 2026-04-07 02:11:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:11:59.905277 | orchestrator | 2026-04-07 02:11:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:11:59.906577 | orchestrator | 2026-04-07 02:11:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:11:59.906628 | orchestrator | 2026-04-07 02:11:59 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:12:02.950158 | orchestrator | 2026-04-07 02:12:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:02.951395 | orchestrator | 2026-04-07 02:12:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:02.951432 | orchestrator | 2026-04-07 02:12:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:06.011908 | orchestrator | 2026-04-07 02:12:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:06.013881 | orchestrator | 2026-04-07 02:12:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:06.013908 | orchestrator | 2026-04-07 02:12:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:09.064024 | orchestrator | 2026-04-07 02:12:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:09.066201 | orchestrator | 2026-04-07 02:12:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:09.066259 | orchestrator | 2026-04-07 02:12:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:12.107440 | orchestrator | 2026-04-07 02:12:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:12.108930 | orchestrator | 2026-04-07 02:12:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:12.108997 | orchestrator | 2026-04-07 02:12:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:15.154519 | orchestrator | 2026-04-07 02:12:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:15.157479 | orchestrator | 2026-04-07 02:12:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:15.157544 | orchestrator | 2026-04-07 02:12:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:18.207527 | orchestrator | 2026-04-07 
02:12:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:18.208538 | orchestrator | 2026-04-07 02:12:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:18.208598 | orchestrator | 2026-04-07 02:12:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:21.269747 | orchestrator | 2026-04-07 02:12:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:21.272788 | orchestrator | 2026-04-07 02:12:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:21.273205 | orchestrator | 2026-04-07 02:12:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:24.316708 | orchestrator | 2026-04-07 02:12:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:24.319384 | orchestrator | 2026-04-07 02:12:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:24.319471 | orchestrator | 2026-04-07 02:12:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:27.361982 | orchestrator | 2026-04-07 02:12:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:27.363087 | orchestrator | 2026-04-07 02:12:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:27.363139 | orchestrator | 2026-04-07 02:12:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:30.412928 | orchestrator | 2026-04-07 02:12:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:30.414675 | orchestrator | 2026-04-07 02:12:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:30.414757 | orchestrator | 2026-04-07 02:12:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:33.457810 | orchestrator | 2026-04-07 02:12:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:12:33.460053 | orchestrator | 2026-04-07 02:12:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:33.460121 | orchestrator | 2026-04-07 02:12:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:36.509272 | orchestrator | 2026-04-07 02:12:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:36.511799 | orchestrator | 2026-04-07 02:12:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:36.512041 | orchestrator | 2026-04-07 02:12:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:39.551996 | orchestrator | 2026-04-07 02:12:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:39.553678 | orchestrator | 2026-04-07 02:12:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:39.553731 | orchestrator | 2026-04-07 02:12:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:42.601955 | orchestrator | 2026-04-07 02:12:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:42.604139 | orchestrator | 2026-04-07 02:12:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:42.604682 | orchestrator | 2026-04-07 02:12:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:45.655288 | orchestrator | 2026-04-07 02:12:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:45.658381 | orchestrator | 2026-04-07 02:12:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:45.658468 | orchestrator | 2026-04-07 02:12:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:48.708621 | orchestrator | 2026-04-07 02:12:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:48.711742 | orchestrator | 2026-04-07 02:12:48 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:48.711958 | orchestrator | 2026-04-07 02:12:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:51.759088 | orchestrator | 2026-04-07 02:12:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:51.760423 | orchestrator | 2026-04-07 02:12:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:51.760452 | orchestrator | 2026-04-07 02:12:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:54.805146 | orchestrator | 2026-04-07 02:12:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:54.807740 | orchestrator | 2026-04-07 02:12:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:54.807824 | orchestrator | 2026-04-07 02:12:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:12:57.857512 | orchestrator | 2026-04-07 02:12:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:12:57.860056 | orchestrator | 2026-04-07 02:12:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:12:57.860181 | orchestrator | 2026-04-07 02:12:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:00.907535 | orchestrator | 2026-04-07 02:13:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:00.910613 | orchestrator | 2026-04-07 02:13:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:00.910700 | orchestrator | 2026-04-07 02:13:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:03.956722 | orchestrator | 2026-04-07 02:13:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:03.957314 | orchestrator | 2026-04-07 02:13:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:13:03.957731 | orchestrator | 2026-04-07 02:13:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:07.008657 | orchestrator | 2026-04-07 02:13:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:07.010867 | orchestrator | 2026-04-07 02:13:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:07.010964 | orchestrator | 2026-04-07 02:13:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:10.063833 | orchestrator | 2026-04-07 02:13:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:10.065217 | orchestrator | 2026-04-07 02:13:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:10.065313 | orchestrator | 2026-04-07 02:13:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:13.120177 | orchestrator | 2026-04-07 02:13:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:13.121676 | orchestrator | 2026-04-07 02:13:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:13.122148 | orchestrator | 2026-04-07 02:13:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:16.176413 | orchestrator | 2026-04-07 02:13:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:16.177914 | orchestrator | 2026-04-07 02:13:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:16.178210 | orchestrator | 2026-04-07 02:13:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:19.226539 | orchestrator | 2026-04-07 02:13:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:19.228392 | orchestrator | 2026-04-07 02:13:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:19.228474 | orchestrator | 2026-04-07 02:13:19 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:13:22.280905 | orchestrator | 2026-04-07 02:13:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:22.282831 | orchestrator | 2026-04-07 02:13:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:22.283030 | orchestrator | 2026-04-07 02:13:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:25.337714 | orchestrator | 2026-04-07 02:13:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:25.340662 | orchestrator | 2026-04-07 02:13:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:25.340839 | orchestrator | 2026-04-07 02:13:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:28.388820 | orchestrator | 2026-04-07 02:13:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:28.391751 | orchestrator | 2026-04-07 02:13:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:28.391831 | orchestrator | 2026-04-07 02:13:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:31.439956 | orchestrator | 2026-04-07 02:13:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:31.442295 | orchestrator | 2026-04-07 02:13:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:31.442366 | orchestrator | 2026-04-07 02:13:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:34.494510 | orchestrator | 2026-04-07 02:13:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:34.496293 | orchestrator | 2026-04-07 02:13:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:34.496353 | orchestrator | 2026-04-07 02:13:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:37.545741 | orchestrator | 2026-04-07 
02:13:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:37.546419 | orchestrator | 2026-04-07 02:13:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:37.546460 | orchestrator | 2026-04-07 02:13:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:40.591878 | orchestrator | 2026-04-07 02:13:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:40.593246 | orchestrator | 2026-04-07 02:13:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:40.593300 | orchestrator | 2026-04-07 02:13:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:43.640726 | orchestrator | 2026-04-07 02:13:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:43.642937 | orchestrator | 2026-04-07 02:13:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:43.643006 | orchestrator | 2026-04-07 02:13:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:46.691598 | orchestrator | 2026-04-07 02:13:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:46.694225 | orchestrator | 2026-04-07 02:13:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:46.694321 | orchestrator | 2026-04-07 02:13:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:49.750580 | orchestrator | 2026-04-07 02:13:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:49.753297 | orchestrator | 2026-04-07 02:13:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:49.753358 | orchestrator | 2026-04-07 02:13:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:52.806737 | orchestrator | 2026-04-07 02:13:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:13:52.807805 | orchestrator | 2026-04-07 02:13:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:52.807844 | orchestrator | 2026-04-07 02:13:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:55.850462 | orchestrator | 2026-04-07 02:13:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:55.851166 | orchestrator | 2026-04-07 02:13:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:55.851217 | orchestrator | 2026-04-07 02:13:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:13:58.900525 | orchestrator | 2026-04-07 02:13:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:13:58.902759 | orchestrator | 2026-04-07 02:13:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:13:58.902837 | orchestrator | 2026-04-07 02:13:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:14:01.955592 | orchestrator | 2026-04-07 02:14:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:14:01.956967 | orchestrator | 2026-04-07 02:14:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:14:01.957163 | orchestrator | 2026-04-07 02:14:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:14:05.008913 | orchestrator | 2026-04-07 02:14:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:14:05.010579 | orchestrator | 2026-04-07 02:14:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:14:05.010642 | orchestrator | 2026-04-07 02:14:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:14:08.072458 | orchestrator | 2026-04-07 02:14:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:14:08.073738 | orchestrator | 2026-04-07 02:14:08 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:14:08.073836 | orchestrator | 2026-04-07 02:14:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:14:11.116730 | orchestrator | 2026-04-07 02:14:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:14:11.119385 | orchestrator | 2026-04-07 02:14:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:14:11.119531 | orchestrator | 2026-04-07 02:14:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:14:14.167634 | orchestrator | 2026-04-07 02:14:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:14:14.168940 | orchestrator | 2026-04-07 02:14:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:14:14.168986 | orchestrator | 2026-04-07 02:14:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:14:17.210273 | orchestrator | 2026-04-07 02:14:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:14:17.211867 | orchestrator | 2026-04-07 02:14:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:14:17.212221 | orchestrator | 2026-04-07 02:14:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:14:20.263447 | orchestrator | 2026-04-07 02:14:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:14:20.264448 | orchestrator | 2026-04-07 02:14:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:14:20.264501 | orchestrator | 2026-04-07 02:14:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:14:23.323972 | orchestrator | 2026-04-07 02:14:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:14:23.328719 | orchestrator | 2026-04-07 02:14:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:14:23.328793 | orchestrator | 2026-04-07 02:14:23 | INFO  | Wait 1 second(s) until the next check
2026-04-07 02:14:26.381219 | orchestrator | 2026-04-07 02:14:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 02:14:26.382293 | orchestrator | 2026-04-07 02:14:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 02:14:26.382341 | orchestrator | 2026-04-07 02:14:26 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated roughly every 3 seconds from 02:14:29 to 02:19:53; tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remained in state STARTED throughout ...]
2026-04-07 02:19:56.084114 | orchestrator | 2026-04-07 02:19:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 02:19:56.085492 | orchestrator | 2026-04-07 02:19:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 02:19:56.085519 | orchestrator | 2026-04-07 02:19:56 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:19:59.143944 | orchestrator | 2026-04-07 02:19:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:19:59.146311 | orchestrator | 2026-04-07 02:19:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:19:59.146367 | orchestrator | 2026-04-07 02:19:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:02.204132 | orchestrator | 2026-04-07 02:20:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:02.206596 | orchestrator | 2026-04-07 02:20:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:02.206684 | orchestrator | 2026-04-07 02:20:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:05.260581 | orchestrator | 2026-04-07 02:20:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:05.262070 | orchestrator | 2026-04-07 02:20:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:05.262140 | orchestrator | 2026-04-07 02:20:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:08.315402 | orchestrator | 2026-04-07 02:20:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:08.316396 | orchestrator | 2026-04-07 02:20:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:08.316427 | orchestrator | 2026-04-07 02:20:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:11.374714 | orchestrator | 2026-04-07 02:20:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:11.375853 | orchestrator | 2026-04-07 02:20:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:11.376146 | orchestrator | 2026-04-07 02:20:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:14.429401 | orchestrator | 2026-04-07 
02:20:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:14.431354 | orchestrator | 2026-04-07 02:20:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:14.431770 | orchestrator | 2026-04-07 02:20:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:17.483474 | orchestrator | 2026-04-07 02:20:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:17.487387 | orchestrator | 2026-04-07 02:20:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:17.487473 | orchestrator | 2026-04-07 02:20:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:20.544673 | orchestrator | 2026-04-07 02:20:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:20.546577 | orchestrator | 2026-04-07 02:20:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:20.546622 | orchestrator | 2026-04-07 02:20:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:23.598124 | orchestrator | 2026-04-07 02:20:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:23.600083 | orchestrator | 2026-04-07 02:20:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:23.600305 | orchestrator | 2026-04-07 02:20:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:26.656291 | orchestrator | 2026-04-07 02:20:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:26.659382 | orchestrator | 2026-04-07 02:20:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:26.659444 | orchestrator | 2026-04-07 02:20:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:29.711915 | orchestrator | 2026-04-07 02:20:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:20:29.713760 | orchestrator | 2026-04-07 02:20:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:29.713852 | orchestrator | 2026-04-07 02:20:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:32.764935 | orchestrator | 2026-04-07 02:20:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:32.766180 | orchestrator | 2026-04-07 02:20:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:32.766320 | orchestrator | 2026-04-07 02:20:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:35.821557 | orchestrator | 2026-04-07 02:20:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:35.823174 | orchestrator | 2026-04-07 02:20:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:35.823396 | orchestrator | 2026-04-07 02:20:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:38.866794 | orchestrator | 2026-04-07 02:20:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:38.867957 | orchestrator | 2026-04-07 02:20:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:38.868013 | orchestrator | 2026-04-07 02:20:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:41.912419 | orchestrator | 2026-04-07 02:20:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:41.914928 | orchestrator | 2026-04-07 02:20:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:41.915001 | orchestrator | 2026-04-07 02:20:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:44.966435 | orchestrator | 2026-04-07 02:20:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:44.967989 | orchestrator | 2026-04-07 02:20:44 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:44.968051 | orchestrator | 2026-04-07 02:20:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:48.020768 | orchestrator | 2026-04-07 02:20:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:48.022420 | orchestrator | 2026-04-07 02:20:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:48.022502 | orchestrator | 2026-04-07 02:20:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:51.078341 | orchestrator | 2026-04-07 02:20:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:51.079631 | orchestrator | 2026-04-07 02:20:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:51.079724 | orchestrator | 2026-04-07 02:20:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:54.126530 | orchestrator | 2026-04-07 02:20:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:54.127162 | orchestrator | 2026-04-07 02:20:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:54.127320 | orchestrator | 2026-04-07 02:20:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:20:57.172507 | orchestrator | 2026-04-07 02:20:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:20:57.174702 | orchestrator | 2026-04-07 02:20:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:20:57.174850 | orchestrator | 2026-04-07 02:20:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:00.222224 | orchestrator | 2026-04-07 02:21:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:00.224863 | orchestrator | 2026-04-07 02:21:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:21:00.224883 | orchestrator | 2026-04-07 02:21:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:03.274476 | orchestrator | 2026-04-07 02:21:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:03.276099 | orchestrator | 2026-04-07 02:21:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:03.276130 | orchestrator | 2026-04-07 02:21:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:06.325957 | orchestrator | 2026-04-07 02:21:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:06.328374 | orchestrator | 2026-04-07 02:21:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:06.328554 | orchestrator | 2026-04-07 02:21:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:09.370795 | orchestrator | 2026-04-07 02:21:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:09.372254 | orchestrator | 2026-04-07 02:21:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:09.372293 | orchestrator | 2026-04-07 02:21:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:12.414983 | orchestrator | 2026-04-07 02:21:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:12.417479 | orchestrator | 2026-04-07 02:21:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:12.417563 | orchestrator | 2026-04-07 02:21:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:15.462882 | orchestrator | 2026-04-07 02:21:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:15.465625 | orchestrator | 2026-04-07 02:21:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:15.465713 | orchestrator | 2026-04-07 02:21:15 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:21:18.529825 | orchestrator | 2026-04-07 02:21:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:18.531277 | orchestrator | 2026-04-07 02:21:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:18.531325 | orchestrator | 2026-04-07 02:21:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:21.587517 | orchestrator | 2026-04-07 02:21:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:21.588819 | orchestrator | 2026-04-07 02:21:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:21.588892 | orchestrator | 2026-04-07 02:21:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:24.646597 | orchestrator | 2026-04-07 02:21:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:24.649120 | orchestrator | 2026-04-07 02:21:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:24.649294 | orchestrator | 2026-04-07 02:21:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:27.701543 | orchestrator | 2026-04-07 02:21:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:27.703054 | orchestrator | 2026-04-07 02:21:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:27.703204 | orchestrator | 2026-04-07 02:21:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:30.755327 | orchestrator | 2026-04-07 02:21:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:30.756955 | orchestrator | 2026-04-07 02:21:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:30.757077 | orchestrator | 2026-04-07 02:21:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:33.808403 | orchestrator | 2026-04-07 
02:21:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:33.811968 | orchestrator | 2026-04-07 02:21:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:33.812044 | orchestrator | 2026-04-07 02:21:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:36.869694 | orchestrator | 2026-04-07 02:21:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:36.872536 | orchestrator | 2026-04-07 02:21:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:36.872607 | orchestrator | 2026-04-07 02:21:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:39.923328 | orchestrator | 2026-04-07 02:21:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:39.924784 | orchestrator | 2026-04-07 02:21:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:39.924830 | orchestrator | 2026-04-07 02:21:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:42.975082 | orchestrator | 2026-04-07 02:21:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:42.977326 | orchestrator | 2026-04-07 02:21:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:42.977395 | orchestrator | 2026-04-07 02:21:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:46.031491 | orchestrator | 2026-04-07 02:21:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:46.032797 | orchestrator | 2026-04-07 02:21:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:46.032848 | orchestrator | 2026-04-07 02:21:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:49.074390 | orchestrator | 2026-04-07 02:21:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:21:49.077743 | orchestrator | 2026-04-07 02:21:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:49.077822 | orchestrator | 2026-04-07 02:21:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:52.133072 | orchestrator | 2026-04-07 02:21:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:52.135079 | orchestrator | 2026-04-07 02:21:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:52.135135 | orchestrator | 2026-04-07 02:21:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:55.180714 | orchestrator | 2026-04-07 02:21:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:55.182045 | orchestrator | 2026-04-07 02:21:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:55.182082 | orchestrator | 2026-04-07 02:21:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:21:58.233741 | orchestrator | 2026-04-07 02:21:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:21:58.234824 | orchestrator | 2026-04-07 02:21:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:21:58.235180 | orchestrator | 2026-04-07 02:21:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:01.287109 | orchestrator | 2026-04-07 02:22:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:01.289585 | orchestrator | 2026-04-07 02:22:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:01.289667 | orchestrator | 2026-04-07 02:22:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:04.344857 | orchestrator | 2026-04-07 02:22:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:04.346788 | orchestrator | 2026-04-07 02:22:04 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:04.346834 | orchestrator | 2026-04-07 02:22:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:07.399785 | orchestrator | 2026-04-07 02:22:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:07.401176 | orchestrator | 2026-04-07 02:22:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:07.401250 | orchestrator | 2026-04-07 02:22:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:10.445395 | orchestrator | 2026-04-07 02:22:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:10.446432 | orchestrator | 2026-04-07 02:22:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:10.446484 | orchestrator | 2026-04-07 02:22:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:13.501603 | orchestrator | 2026-04-07 02:22:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:13.503082 | orchestrator | 2026-04-07 02:22:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:13.503147 | orchestrator | 2026-04-07 02:22:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:16.553489 | orchestrator | 2026-04-07 02:22:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:16.556197 | orchestrator | 2026-04-07 02:22:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:16.556385 | orchestrator | 2026-04-07 02:22:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:19.609880 | orchestrator | 2026-04-07 02:22:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:19.612010 | orchestrator | 2026-04-07 02:22:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:22:19.612057 | orchestrator | 2026-04-07 02:22:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:22.663324 | orchestrator | 2026-04-07 02:22:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:22.666277 | orchestrator | 2026-04-07 02:22:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:22.666359 | orchestrator | 2026-04-07 02:22:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:25.713530 | orchestrator | 2026-04-07 02:22:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:25.716043 | orchestrator | 2026-04-07 02:22:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:25.716101 | orchestrator | 2026-04-07 02:22:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:28.763993 | orchestrator | 2026-04-07 02:22:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:28.766303 | orchestrator | 2026-04-07 02:22:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:28.766431 | orchestrator | 2026-04-07 02:22:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:31.809334 | orchestrator | 2026-04-07 02:22:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:31.812307 | orchestrator | 2026-04-07 02:22:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:31.812367 | orchestrator | 2026-04-07 02:22:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:34.854995 | orchestrator | 2026-04-07 02:22:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:34.856901 | orchestrator | 2026-04-07 02:22:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:34.857185 | orchestrator | 2026-04-07 02:22:34 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:22:37.914545 | orchestrator | 2026-04-07 02:22:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:37.916750 | orchestrator | 2026-04-07 02:22:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:37.916799 | orchestrator | 2026-04-07 02:22:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:40.964032 | orchestrator | 2026-04-07 02:22:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:40.965468 | orchestrator | 2026-04-07 02:22:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:40.965547 | orchestrator | 2026-04-07 02:22:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:44.018072 | orchestrator | 2026-04-07 02:22:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:44.020834 | orchestrator | 2026-04-07 02:22:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:44.020950 | orchestrator | 2026-04-07 02:22:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:47.061371 | orchestrator | 2026-04-07 02:22:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:47.063702 | orchestrator | 2026-04-07 02:22:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:47.063748 | orchestrator | 2026-04-07 02:22:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:50.108968 | orchestrator | 2026-04-07 02:22:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:50.110979 | orchestrator | 2026-04-07 02:22:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:50.111100 | orchestrator | 2026-04-07 02:22:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:53.162324 | orchestrator | 2026-04-07 
02:22:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:53.173495 | orchestrator | 2026-04-07 02:22:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:53.173582 | orchestrator | 2026-04-07 02:22:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:56.212582 | orchestrator | 2026-04-07 02:22:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:56.218541 | orchestrator | 2026-04-07 02:22:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:56.218577 | orchestrator | 2026-04-07 02:22:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:22:59.268589 | orchestrator | 2026-04-07 02:22:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:22:59.270100 | orchestrator | 2026-04-07 02:22:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:22:59.270147 | orchestrator | 2026-04-07 02:22:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:02.314562 | orchestrator | 2026-04-07 02:23:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:02.317294 | orchestrator | 2026-04-07 02:23:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:02.317358 | orchestrator | 2026-04-07 02:23:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:05.366338 | orchestrator | 2026-04-07 02:23:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:05.367715 | orchestrator | 2026-04-07 02:23:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:05.367788 | orchestrator | 2026-04-07 02:23:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:08.423534 | orchestrator | 2026-04-07 02:23:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:23:08.425326 | orchestrator | 2026-04-07 02:23:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:08.425361 | orchestrator | 2026-04-07 02:23:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:11.477384 | orchestrator | 2026-04-07 02:23:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:11.478975 | orchestrator | 2026-04-07 02:23:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:11.479022 | orchestrator | 2026-04-07 02:23:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:14.523992 | orchestrator | 2026-04-07 02:23:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:14.526389 | orchestrator | 2026-04-07 02:23:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:14.526452 | orchestrator | 2026-04-07 02:23:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:17.571781 | orchestrator | 2026-04-07 02:23:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:17.572972 | orchestrator | 2026-04-07 02:23:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:17.573006 | orchestrator | 2026-04-07 02:23:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:20.624378 | orchestrator | 2026-04-07 02:23:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:20.625282 | orchestrator | 2026-04-07 02:23:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:20.625318 | orchestrator | 2026-04-07 02:23:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:23.674113 | orchestrator | 2026-04-07 02:23:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:23.675953 | orchestrator | 2026-04-07 02:23:23 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:23.676028 | orchestrator | 2026-04-07 02:23:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:26.718402 | orchestrator | 2026-04-07 02:23:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:26.720273 | orchestrator | 2026-04-07 02:23:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:26.720415 | orchestrator | 2026-04-07 02:23:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:29.772881 | orchestrator | 2026-04-07 02:23:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:29.775080 | orchestrator | 2026-04-07 02:23:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:29.775361 | orchestrator | 2026-04-07 02:23:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:32.827397 | orchestrator | 2026-04-07 02:23:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:32.828718 | orchestrator | 2026-04-07 02:23:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:32.829147 | orchestrator | 2026-04-07 02:23:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:35.878058 | orchestrator | 2026-04-07 02:23:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:35.881817 | orchestrator | 2026-04-07 02:23:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:23:35.881902 | orchestrator | 2026-04-07 02:23:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:23:38.940889 | orchestrator | 2026-04-07 02:23:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:23:38.944005 | orchestrator | 2026-04-07 02:23:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:23:38.944081 | orchestrator | 2026-04-07 02:23:38 | INFO  | Wait 1 second(s) until the next check
2026-04-07 02:23:42.001705 | orchestrator | 2026-04-07 02:23:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 02:23:42.003264 | orchestrator | 2026-04-07 02:23:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 02:23:42.003415 | orchestrator | 2026-04-07 02:23:42 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 02:23:45 through 02:28:41; tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remained in state STARTED throughout ...]
2026-04-07 02:28:41.049988 | orchestrator | 2026-04-07 02:28:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 02:28:41.051530 | orchestrator | 2026-04-07 02:28:41 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:28:41.051554 | orchestrator | 2026-04-07 02:28:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:28:44.099621 | orchestrator | 2026-04-07 02:28:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:28:44.101742 | orchestrator | 2026-04-07 02:28:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:28:44.101797 | orchestrator | 2026-04-07 02:28:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:28:47.146528 | orchestrator | 2026-04-07 02:28:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:28:47.148859 | orchestrator | 2026-04-07 02:28:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:28:47.149833 | orchestrator | 2026-04-07 02:28:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:28:50.201120 | orchestrator | 2026-04-07 02:28:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:28:50.203454 | orchestrator | 2026-04-07 02:28:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:28:50.203569 | orchestrator | 2026-04-07 02:28:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:28:53.261630 | orchestrator | 2026-04-07 02:28:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:28:53.263515 | orchestrator | 2026-04-07 02:28:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:28:53.263568 | orchestrator | 2026-04-07 02:28:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:28:56.314248 | orchestrator | 2026-04-07 02:28:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:28:56.317260 | orchestrator | 2026-04-07 02:28:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:28:56.317348 | orchestrator | 2026-04-07 02:28:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:28:59.370826 | orchestrator | 2026-04-07 02:28:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:28:59.372527 | orchestrator | 2026-04-07 02:28:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:28:59.372570 | orchestrator | 2026-04-07 02:28:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:02.429656 | orchestrator | 2026-04-07 02:29:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:02.431535 | orchestrator | 2026-04-07 02:29:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:02.431632 | orchestrator | 2026-04-07 02:29:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:05.494132 | orchestrator | 2026-04-07 02:29:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:05.495930 | orchestrator | 2026-04-07 02:29:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:05.495988 | orchestrator | 2026-04-07 02:29:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:08.544104 | orchestrator | 2026-04-07 02:29:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:08.546380 | orchestrator | 2026-04-07 02:29:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:08.546459 | orchestrator | 2026-04-07 02:29:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:11.589681 | orchestrator | 2026-04-07 02:29:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:11.591029 | orchestrator | 2026-04-07 02:29:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:11.591071 | orchestrator | 2026-04-07 02:29:11 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:29:14.641093 | orchestrator | 2026-04-07 02:29:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:14.642654 | orchestrator | 2026-04-07 02:29:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:14.642886 | orchestrator | 2026-04-07 02:29:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:17.685596 | orchestrator | 2026-04-07 02:29:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:17.687070 | orchestrator | 2026-04-07 02:29:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:17.687152 | orchestrator | 2026-04-07 02:29:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:20.730848 | orchestrator | 2026-04-07 02:29:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:20.732853 | orchestrator | 2026-04-07 02:29:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:20.732965 | orchestrator | 2026-04-07 02:29:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:23.772540 | orchestrator | 2026-04-07 02:29:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:23.775567 | orchestrator | 2026-04-07 02:29:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:23.775675 | orchestrator | 2026-04-07 02:29:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:26.820596 | orchestrator | 2026-04-07 02:29:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:26.822608 | orchestrator | 2026-04-07 02:29:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:26.822670 | orchestrator | 2026-04-07 02:29:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:29.879051 | orchestrator | 2026-04-07 
02:29:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:29.880597 | orchestrator | 2026-04-07 02:29:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:29.880676 | orchestrator | 2026-04-07 02:29:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:32.927330 | orchestrator | 2026-04-07 02:29:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:32.928447 | orchestrator | 2026-04-07 02:29:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:32.928489 | orchestrator | 2026-04-07 02:29:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:35.979635 | orchestrator | 2026-04-07 02:29:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:35.982257 | orchestrator | 2026-04-07 02:29:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:35.982336 | orchestrator | 2026-04-07 02:29:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:39.035012 | orchestrator | 2026-04-07 02:29:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:39.036859 | orchestrator | 2026-04-07 02:29:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:39.036971 | orchestrator | 2026-04-07 02:29:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:42.079336 | orchestrator | 2026-04-07 02:29:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:42.082765 | orchestrator | 2026-04-07 02:29:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:42.082877 | orchestrator | 2026-04-07 02:29:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:45.125523 | orchestrator | 2026-04-07 02:29:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:29:45.128004 | orchestrator | 2026-04-07 02:29:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:45.128063 | orchestrator | 2026-04-07 02:29:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:48.172468 | orchestrator | 2026-04-07 02:29:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:48.173607 | orchestrator | 2026-04-07 02:29:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:48.173663 | orchestrator | 2026-04-07 02:29:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:51.224402 | orchestrator | 2026-04-07 02:29:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:51.226155 | orchestrator | 2026-04-07 02:29:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:51.226250 | orchestrator | 2026-04-07 02:29:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:54.275089 | orchestrator | 2026-04-07 02:29:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:54.277310 | orchestrator | 2026-04-07 02:29:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:54.277852 | orchestrator | 2026-04-07 02:29:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:29:57.320364 | orchestrator | 2026-04-07 02:29:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:29:57.322428 | orchestrator | 2026-04-07 02:29:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:29:57.322516 | orchestrator | 2026-04-07 02:29:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:00.370370 | orchestrator | 2026-04-07 02:30:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:00.371432 | orchestrator | 2026-04-07 02:30:00 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:00.371455 | orchestrator | 2026-04-07 02:30:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:03.417860 | orchestrator | 2026-04-07 02:30:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:03.419439 | orchestrator | 2026-04-07 02:30:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:03.419476 | orchestrator | 2026-04-07 02:30:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:06.471639 | orchestrator | 2026-04-07 02:30:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:06.473746 | orchestrator | 2026-04-07 02:30:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:06.473864 | orchestrator | 2026-04-07 02:30:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:09.523545 | orchestrator | 2026-04-07 02:30:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:09.525506 | orchestrator | 2026-04-07 02:30:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:09.525558 | orchestrator | 2026-04-07 02:30:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:12.574715 | orchestrator | 2026-04-07 02:30:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:12.575220 | orchestrator | 2026-04-07 02:30:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:12.575266 | orchestrator | 2026-04-07 02:30:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:15.627054 | orchestrator | 2026-04-07 02:30:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:15.629906 | orchestrator | 2026-04-07 02:30:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:30:15.629981 | orchestrator | 2026-04-07 02:30:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:18.681564 | orchestrator | 2026-04-07 02:30:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:18.685621 | orchestrator | 2026-04-07 02:30:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:18.685697 | orchestrator | 2026-04-07 02:30:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:21.738414 | orchestrator | 2026-04-07 02:30:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:21.739349 | orchestrator | 2026-04-07 02:30:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:21.739382 | orchestrator | 2026-04-07 02:30:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:24.790364 | orchestrator | 2026-04-07 02:30:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:24.791810 | orchestrator | 2026-04-07 02:30:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:24.792561 | orchestrator | 2026-04-07 02:30:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:27.833423 | orchestrator | 2026-04-07 02:30:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:27.834797 | orchestrator | 2026-04-07 02:30:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:27.834882 | orchestrator | 2026-04-07 02:30:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:30.884843 | orchestrator | 2026-04-07 02:30:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:30.886539 | orchestrator | 2026-04-07 02:30:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:30.886610 | orchestrator | 2026-04-07 02:30:30 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:30:33.938003 | orchestrator | 2026-04-07 02:30:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:33.938607 | orchestrator | 2026-04-07 02:30:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:33.938637 | orchestrator | 2026-04-07 02:30:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:36.989526 | orchestrator | 2026-04-07 02:30:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:36.990826 | orchestrator | 2026-04-07 02:30:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:36.990881 | orchestrator | 2026-04-07 02:30:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:40.038230 | orchestrator | 2026-04-07 02:30:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:40.039838 | orchestrator | 2026-04-07 02:30:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:40.039891 | orchestrator | 2026-04-07 02:30:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:43.084139 | orchestrator | 2026-04-07 02:30:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:43.086607 | orchestrator | 2026-04-07 02:30:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:43.086656 | orchestrator | 2026-04-07 02:30:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:46.143136 | orchestrator | 2026-04-07 02:30:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:46.145978 | orchestrator | 2026-04-07 02:30:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:46.146111 | orchestrator | 2026-04-07 02:30:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:49.190271 | orchestrator | 2026-04-07 
02:30:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:49.191135 | orchestrator | 2026-04-07 02:30:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:49.191164 | orchestrator | 2026-04-07 02:30:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:52.235934 | orchestrator | 2026-04-07 02:30:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:52.236949 | orchestrator | 2026-04-07 02:30:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:52.236990 | orchestrator | 2026-04-07 02:30:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:55.287540 | orchestrator | 2026-04-07 02:30:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:55.289492 | orchestrator | 2026-04-07 02:30:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:55.289544 | orchestrator | 2026-04-07 02:30:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:30:58.335751 | orchestrator | 2026-04-07 02:30:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:30:58.338655 | orchestrator | 2026-04-07 02:30:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:30:58.338696 | orchestrator | 2026-04-07 02:30:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:01.391866 | orchestrator | 2026-04-07 02:31:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:01.393991 | orchestrator | 2026-04-07 02:31:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:01.394067 | orchestrator | 2026-04-07 02:31:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:04.442983 | orchestrator | 2026-04-07 02:31:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:31:04.444413 | orchestrator | 2026-04-07 02:31:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:04.444465 | orchestrator | 2026-04-07 02:31:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:07.494899 | orchestrator | 2026-04-07 02:31:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:07.496883 | orchestrator | 2026-04-07 02:31:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:07.496938 | orchestrator | 2026-04-07 02:31:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:10.546788 | orchestrator | 2026-04-07 02:31:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:10.547207 | orchestrator | 2026-04-07 02:31:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:10.547237 | orchestrator | 2026-04-07 02:31:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:13.587589 | orchestrator | 2026-04-07 02:31:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:13.589101 | orchestrator | 2026-04-07 02:31:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:13.589152 | orchestrator | 2026-04-07 02:31:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:16.632830 | orchestrator | 2026-04-07 02:31:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:16.633785 | orchestrator | 2026-04-07 02:31:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:16.633821 | orchestrator | 2026-04-07 02:31:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:19.680647 | orchestrator | 2026-04-07 02:31:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:19.680874 | orchestrator | 2026-04-07 02:31:19 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:19.680900 | orchestrator | 2026-04-07 02:31:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:22.734384 | orchestrator | 2026-04-07 02:31:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:22.734948 | orchestrator | 2026-04-07 02:31:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:22.735039 | orchestrator | 2026-04-07 02:31:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:25.788457 | orchestrator | 2026-04-07 02:31:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:25.791401 | orchestrator | 2026-04-07 02:31:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:25.791461 | orchestrator | 2026-04-07 02:31:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:28.836466 | orchestrator | 2026-04-07 02:31:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:28.837205 | orchestrator | 2026-04-07 02:31:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:28.837240 | orchestrator | 2026-04-07 02:31:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:31.882641 | orchestrator | 2026-04-07 02:31:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:31.883460 | orchestrator | 2026-04-07 02:31:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:31.883504 | orchestrator | 2026-04-07 02:31:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:34.930656 | orchestrator | 2026-04-07 02:31:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:34.930886 | orchestrator | 2026-04-07 02:31:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:31:34.930911 | orchestrator | 2026-04-07 02:31:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:37.975821 | orchestrator | 2026-04-07 02:31:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:37.977249 | orchestrator | 2026-04-07 02:31:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:37.977413 | orchestrator | 2026-04-07 02:31:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:41.036744 | orchestrator | 2026-04-07 02:31:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:41.038515 | orchestrator | 2026-04-07 02:31:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:41.038571 | orchestrator | 2026-04-07 02:31:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:44.092576 | orchestrator | 2026-04-07 02:31:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:44.094427 | orchestrator | 2026-04-07 02:31:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:44.094511 | orchestrator | 2026-04-07 02:31:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:47.140656 | orchestrator | 2026-04-07 02:31:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:47.144134 | orchestrator | 2026-04-07 02:31:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:47.144248 | orchestrator | 2026-04-07 02:31:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:50.198258 | orchestrator | 2026-04-07 02:31:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:50.199996 | orchestrator | 2026-04-07 02:31:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:50.200233 | orchestrator | 2026-04-07 02:31:50 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:31:53.258434 | orchestrator | 2026-04-07 02:31:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:53.260228 | orchestrator | 2026-04-07 02:31:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:53.260290 | orchestrator | 2026-04-07 02:31:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:56.313867 | orchestrator | 2026-04-07 02:31:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:56.314919 | orchestrator | 2026-04-07 02:31:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:56.314963 | orchestrator | 2026-04-07 02:31:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:31:59.364485 | orchestrator | 2026-04-07 02:31:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:31:59.365209 | orchestrator | 2026-04-07 02:31:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:31:59.365233 | orchestrator | 2026-04-07 02:31:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:32:02.416679 | orchestrator | 2026-04-07 02:32:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:32:02.419551 | orchestrator | 2026-04-07 02:32:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:32:02.419628 | orchestrator | 2026-04-07 02:32:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:32:05.473584 | orchestrator | 2026-04-07 02:32:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:32:05.475064 | orchestrator | 2026-04-07 02:32:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:32:05.475163 | orchestrator | 2026-04-07 02:32:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:32:08.521191 | orchestrator | 2026-04-07 
02:32:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:32:08.524722 | orchestrator | 2026-04-07 02:32:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:32:08.524766 | orchestrator | 2026-04-07 02:32:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:32:11.572427 | orchestrator | 2026-04-07 02:32:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:32:11.573445 | orchestrator | 2026-04-07 02:32:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:32:11.573502 | orchestrator | 2026-04-07 02:32:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:32:14.627417 | orchestrator | 2026-04-07 02:32:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:32:14.629427 | orchestrator | 2026-04-07 02:32:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:32:14.629517 | orchestrator | 2026-04-07 02:32:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:32:17.672872 | orchestrator | 2026-04-07 02:32:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:32:17.674550 | orchestrator | 2026-04-07 02:32:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:32:17.674601 | orchestrator | 2026-04-07 02:32:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:32:20.726645 | orchestrator | 2026-04-07 02:32:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:32:20.727181 | orchestrator | 2026-04-07 02:32:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:32:20.727675 | orchestrator | 2026-04-07 02:32:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:32:23.776266 | orchestrator | 2026-04-07 02:32:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:32:23.778459 | orchestrator | 2026-04-07 02:32:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 02:32:23.778531 | orchestrator | 2026-04-07 02:32:23 | INFO  | Wait 1 second(s) until the next check
2026-04-07 02:39:56.244961 | orchestrator | 2026-04-07 02:39:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 02:39:56.246158 | orchestrator | 2026-04-07 02:39:56 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:39:56.246230 | orchestrator | 2026-04-07 02:39:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:39:59.292823 | orchestrator | 2026-04-07 02:39:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:39:59.294808 | orchestrator | 2026-04-07 02:39:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:39:59.294876 | orchestrator | 2026-04-07 02:39:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:02.343952 | orchestrator | 2026-04-07 02:40:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:02.345244 | orchestrator | 2026-04-07 02:40:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:02.345284 | orchestrator | 2026-04-07 02:40:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:05.394101 | orchestrator | 2026-04-07 02:40:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:05.394993 | orchestrator | 2026-04-07 02:40:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:05.395036 | orchestrator | 2026-04-07 02:40:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:08.443497 | orchestrator | 2026-04-07 02:40:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:08.444806 | orchestrator | 2026-04-07 02:40:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:08.445015 | orchestrator | 2026-04-07 02:40:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:11.511424 | orchestrator | 2026-04-07 02:40:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:11.512618 | orchestrator | 2026-04-07 02:40:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:40:11.512666 | orchestrator | 2026-04-07 02:40:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:14.560760 | orchestrator | 2026-04-07 02:40:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:14.561404 | orchestrator | 2026-04-07 02:40:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:14.561438 | orchestrator | 2026-04-07 02:40:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:17.608859 | orchestrator | 2026-04-07 02:40:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:17.610613 | orchestrator | 2026-04-07 02:40:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:17.610694 | orchestrator | 2026-04-07 02:40:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:20.655779 | orchestrator | 2026-04-07 02:40:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:20.657483 | orchestrator | 2026-04-07 02:40:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:20.657551 | orchestrator | 2026-04-07 02:40:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:23.702852 | orchestrator | 2026-04-07 02:40:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:23.704937 | orchestrator | 2026-04-07 02:40:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:23.704986 | orchestrator | 2026-04-07 02:40:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:26.751233 | orchestrator | 2026-04-07 02:40:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:26.753384 | orchestrator | 2026-04-07 02:40:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:26.753514 | orchestrator | 2026-04-07 02:40:26 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:40:29.796455 | orchestrator | 2026-04-07 02:40:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:29.797552 | orchestrator | 2026-04-07 02:40:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:29.797671 | orchestrator | 2026-04-07 02:40:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:32.850676 | orchestrator | 2026-04-07 02:40:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:32.852097 | orchestrator | 2026-04-07 02:40:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:32.852214 | orchestrator | 2026-04-07 02:40:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:35.900155 | orchestrator | 2026-04-07 02:40:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:35.902391 | orchestrator | 2026-04-07 02:40:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:35.902461 | orchestrator | 2026-04-07 02:40:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:38.955197 | orchestrator | 2026-04-07 02:40:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:38.957215 | orchestrator | 2026-04-07 02:40:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:38.957244 | orchestrator | 2026-04-07 02:40:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:42.001976 | orchestrator | 2026-04-07 02:40:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:42.004574 | orchestrator | 2026-04-07 02:40:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:42.004651 | orchestrator | 2026-04-07 02:40:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:45.051061 | orchestrator | 2026-04-07 
02:40:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:45.053150 | orchestrator | 2026-04-07 02:40:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:45.053215 | orchestrator | 2026-04-07 02:40:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:48.092193 | orchestrator | 2026-04-07 02:40:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:48.092746 | orchestrator | 2026-04-07 02:40:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:48.092788 | orchestrator | 2026-04-07 02:40:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:51.134304 | orchestrator | 2026-04-07 02:40:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:51.136083 | orchestrator | 2026-04-07 02:40:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:51.136142 | orchestrator | 2026-04-07 02:40:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:54.177930 | orchestrator | 2026-04-07 02:40:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:54.179570 | orchestrator | 2026-04-07 02:40:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:54.179642 | orchestrator | 2026-04-07 02:40:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:40:57.229500 | orchestrator | 2026-04-07 02:40:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:40:57.231263 | orchestrator | 2026-04-07 02:40:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:40:57.231386 | orchestrator | 2026-04-07 02:40:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:00.277091 | orchestrator | 2026-04-07 02:41:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:41:00.278765 | orchestrator | 2026-04-07 02:41:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:00.278819 | orchestrator | 2026-04-07 02:41:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:03.323244 | orchestrator | 2026-04-07 02:41:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:03.325916 | orchestrator | 2026-04-07 02:41:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:03.326111 | orchestrator | 2026-04-07 02:41:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:06.375904 | orchestrator | 2026-04-07 02:41:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:06.377439 | orchestrator | 2026-04-07 02:41:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:06.377501 | orchestrator | 2026-04-07 02:41:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:09.421097 | orchestrator | 2026-04-07 02:41:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:09.422529 | orchestrator | 2026-04-07 02:41:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:09.422659 | orchestrator | 2026-04-07 02:41:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:12.469998 | orchestrator | 2026-04-07 02:41:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:12.471867 | orchestrator | 2026-04-07 02:41:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:12.471921 | orchestrator | 2026-04-07 02:41:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:15.522517 | orchestrator | 2026-04-07 02:41:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:15.524765 | orchestrator | 2026-04-07 02:41:15 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:15.524931 | orchestrator | 2026-04-07 02:41:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:18.574890 | orchestrator | 2026-04-07 02:41:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:18.578244 | orchestrator | 2026-04-07 02:41:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:18.578315 | orchestrator | 2026-04-07 02:41:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:21.621301 | orchestrator | 2026-04-07 02:41:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:21.623787 | orchestrator | 2026-04-07 02:41:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:21.623844 | orchestrator | 2026-04-07 02:41:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:24.671128 | orchestrator | 2026-04-07 02:41:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:24.673383 | orchestrator | 2026-04-07 02:41:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:24.673445 | orchestrator | 2026-04-07 02:41:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:27.724990 | orchestrator | 2026-04-07 02:41:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:27.726465 | orchestrator | 2026-04-07 02:41:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:27.726610 | orchestrator | 2026-04-07 02:41:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:30.776847 | orchestrator | 2026-04-07 02:41:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:30.779925 | orchestrator | 2026-04-07 02:41:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:41:30.779998 | orchestrator | 2026-04-07 02:41:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:33.826930 | orchestrator | 2026-04-07 02:41:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:33.828769 | orchestrator | 2026-04-07 02:41:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:33.828811 | orchestrator | 2026-04-07 02:41:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:36.873098 | orchestrator | 2026-04-07 02:41:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:36.875331 | orchestrator | 2026-04-07 02:41:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:36.875399 | orchestrator | 2026-04-07 02:41:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:39.925108 | orchestrator | 2026-04-07 02:41:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:39.926011 | orchestrator | 2026-04-07 02:41:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:39.926126 | orchestrator | 2026-04-07 02:41:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:42.966883 | orchestrator | 2026-04-07 02:41:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:42.968046 | orchestrator | 2026-04-07 02:41:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:42.968083 | orchestrator | 2026-04-07 02:41:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:46.016892 | orchestrator | 2026-04-07 02:41:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:46.017215 | orchestrator | 2026-04-07 02:41:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:46.017248 | orchestrator | 2026-04-07 02:41:46 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:41:49.068137 | orchestrator | 2026-04-07 02:41:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:49.070011 | orchestrator | 2026-04-07 02:41:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:49.070157 | orchestrator | 2026-04-07 02:41:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:52.117846 | orchestrator | 2026-04-07 02:41:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:52.119959 | orchestrator | 2026-04-07 02:41:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:52.120041 | orchestrator | 2026-04-07 02:41:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:55.170923 | orchestrator | 2026-04-07 02:41:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:55.172628 | orchestrator | 2026-04-07 02:41:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:55.172692 | orchestrator | 2026-04-07 02:41:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:41:58.221716 | orchestrator | 2026-04-07 02:41:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:41:58.223536 | orchestrator | 2026-04-07 02:41:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:41:58.223615 | orchestrator | 2026-04-07 02:41:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:01.274280 | orchestrator | 2026-04-07 02:42:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:01.277178 | orchestrator | 2026-04-07 02:42:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:01.277259 | orchestrator | 2026-04-07 02:42:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:04.323803 | orchestrator | 2026-04-07 
02:42:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:04.325879 | orchestrator | 2026-04-07 02:42:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:04.325934 | orchestrator | 2026-04-07 02:42:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:07.370889 | orchestrator | 2026-04-07 02:42:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:07.371914 | orchestrator | 2026-04-07 02:42:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:07.372283 | orchestrator | 2026-04-07 02:42:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:10.417905 | orchestrator | 2026-04-07 02:42:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:10.418917 | orchestrator | 2026-04-07 02:42:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:10.419019 | orchestrator | 2026-04-07 02:42:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:13.465151 | orchestrator | 2026-04-07 02:42:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:13.466809 | orchestrator | 2026-04-07 02:42:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:13.466895 | orchestrator | 2026-04-07 02:42:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:16.511808 | orchestrator | 2026-04-07 02:42:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:16.514934 | orchestrator | 2026-04-07 02:42:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:16.515031 | orchestrator | 2026-04-07 02:42:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:19.565839 | orchestrator | 2026-04-07 02:42:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:42:19.567667 | orchestrator | 2026-04-07 02:42:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:19.567715 | orchestrator | 2026-04-07 02:42:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:22.611709 | orchestrator | 2026-04-07 02:42:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:22.612924 | orchestrator | 2026-04-07 02:42:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:22.612981 | orchestrator | 2026-04-07 02:42:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:25.662783 | orchestrator | 2026-04-07 02:42:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:25.664995 | orchestrator | 2026-04-07 02:42:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:25.665099 | orchestrator | 2026-04-07 02:42:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:28.710114 | orchestrator | 2026-04-07 02:42:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:28.711530 | orchestrator | 2026-04-07 02:42:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:28.711600 | orchestrator | 2026-04-07 02:42:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:31.760158 | orchestrator | 2026-04-07 02:42:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:31.761832 | orchestrator | 2026-04-07 02:42:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:31.761927 | orchestrator | 2026-04-07 02:42:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:34.803113 | orchestrator | 2026-04-07 02:42:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:34.803397 | orchestrator | 2026-04-07 02:42:34 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:34.803425 | orchestrator | 2026-04-07 02:42:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:37.848517 | orchestrator | 2026-04-07 02:42:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:37.850925 | orchestrator | 2026-04-07 02:42:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:37.850994 | orchestrator | 2026-04-07 02:42:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:40.896339 | orchestrator | 2026-04-07 02:42:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:40.899507 | orchestrator | 2026-04-07 02:42:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:40.899835 | orchestrator | 2026-04-07 02:42:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:43.941620 | orchestrator | 2026-04-07 02:42:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:43.942926 | orchestrator | 2026-04-07 02:42:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:43.942993 | orchestrator | 2026-04-07 02:42:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:46.991099 | orchestrator | 2026-04-07 02:42:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:46.992965 | orchestrator | 2026-04-07 02:42:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:46.993014 | orchestrator | 2026-04-07 02:42:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:50.042759 | orchestrator | 2026-04-07 02:42:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:50.045060 | orchestrator | 2026-04-07 02:42:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:42:50.045122 | orchestrator | 2026-04-07 02:42:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:53.094211 | orchestrator | 2026-04-07 02:42:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:53.096959 | orchestrator | 2026-04-07 02:42:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:53.097033 | orchestrator | 2026-04-07 02:42:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:56.142394 | orchestrator | 2026-04-07 02:42:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:56.143233 | orchestrator | 2026-04-07 02:42:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:56.143279 | orchestrator | 2026-04-07 02:42:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:42:59.190727 | orchestrator | 2026-04-07 02:42:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:42:59.193657 | orchestrator | 2026-04-07 02:42:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:42:59.193734 | orchestrator | 2026-04-07 02:42:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:02.228910 | orchestrator | 2026-04-07 02:43:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:02.229537 | orchestrator | 2026-04-07 02:43:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:02.229727 | orchestrator | 2026-04-07 02:43:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:05.276762 | orchestrator | 2026-04-07 02:43:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:05.278967 | orchestrator | 2026-04-07 02:43:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:05.279142 | orchestrator | 2026-04-07 02:43:05 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:43:08.321826 | orchestrator | 2026-04-07 02:43:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:08.323901 | orchestrator | 2026-04-07 02:43:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:08.323951 | orchestrator | 2026-04-07 02:43:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:11.363155 | orchestrator | 2026-04-07 02:43:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:11.365010 | orchestrator | 2026-04-07 02:43:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:11.365078 | orchestrator | 2026-04-07 02:43:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:14.404809 | orchestrator | 2026-04-07 02:43:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:14.407848 | orchestrator | 2026-04-07 02:43:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:14.407920 | orchestrator | 2026-04-07 02:43:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:17.445299 | orchestrator | 2026-04-07 02:43:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:17.448058 | orchestrator | 2026-04-07 02:43:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:17.448106 | orchestrator | 2026-04-07 02:43:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:20.498738 | orchestrator | 2026-04-07 02:43:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:20.500855 | orchestrator | 2026-04-07 02:43:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:20.500905 | orchestrator | 2026-04-07 02:43:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:23.551678 | orchestrator | 2026-04-07 
02:43:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:23.553360 | orchestrator | 2026-04-07 02:43:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:23.553409 | orchestrator | 2026-04-07 02:43:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:26.601989 | orchestrator | 2026-04-07 02:43:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:26.607838 | orchestrator | 2026-04-07 02:43:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:26.607951 | orchestrator | 2026-04-07 02:43:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:29.652632 | orchestrator | 2026-04-07 02:43:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:29.654684 | orchestrator | 2026-04-07 02:43:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:29.655091 | orchestrator | 2026-04-07 02:43:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:32.693166 | orchestrator | 2026-04-07 02:43:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:32.693980 | orchestrator | 2026-04-07 02:43:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:32.694489 | orchestrator | 2026-04-07 02:43:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:35.738955 | orchestrator | 2026-04-07 02:43:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:43:35.742102 | orchestrator | 2026-04-07 02:43:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:43:35.742213 | orchestrator | 2026-04-07 02:43:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:43:38.782208 | orchestrator | 2026-04-07 02:43:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED
2026-04-07 02:43:38.783801 | orchestrator | 2026-04-07 02:43:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 02:43:38.783858 | orchestrator | 2026-04-07 02:43:38 | INFO  | Wait 1 second(s) until the next check
2026-04-07 02:43:41.826462 | orchestrator | 2026-04-07 02:43:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 02:43:41.827515 | orchestrator | 2026-04-07 02:43:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 02:43:41.827701 | orchestrator | 2026-04-07 02:43:41 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 02:43:44 to 02:48:49; tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remain in state STARTED throughout ...]
2026-04-07 02:48:52.972993 | orchestrator | 2026-04-07 02:48:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 02:48:52.974277 | orchestrator | 2026-04-07 02:48:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 02:48:52.974326 | orchestrator | 2026-04-07 02:48:52 | INFO  | Wait 1 second(s) until the next check
2026-04-07 02:48:56.023340 | orchestrator | 2026-04-07 02:48:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state
STARTED 2026-04-07 02:48:56.023780 | orchestrator | 2026-04-07 02:48:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:48:56.024189 | orchestrator | 2026-04-07 02:48:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:48:59.066946 | orchestrator | 2026-04-07 02:48:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:48:59.068690 | orchestrator | 2026-04-07 02:48:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:48:59.068826 | orchestrator | 2026-04-07 02:48:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:02.109558 | orchestrator | 2026-04-07 02:49:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:02.111797 | orchestrator | 2026-04-07 02:49:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:02.111868 | orchestrator | 2026-04-07 02:49:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:05.155470 | orchestrator | 2026-04-07 02:49:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:05.157922 | orchestrator | 2026-04-07 02:49:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:05.157975 | orchestrator | 2026-04-07 02:49:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:08.201409 | orchestrator | 2026-04-07 02:49:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:08.202534 | orchestrator | 2026-04-07 02:49:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:08.202687 | orchestrator | 2026-04-07 02:49:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:11.255723 | orchestrator | 2026-04-07 02:49:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:11.260056 | orchestrator | 2026-04-07 02:49:11 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:11.260127 | orchestrator | 2026-04-07 02:49:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:14.332425 | orchestrator | 2026-04-07 02:49:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:14.334467 | orchestrator | 2026-04-07 02:49:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:14.334697 | orchestrator | 2026-04-07 02:49:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:17.377252 | orchestrator | 2026-04-07 02:49:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:17.379995 | orchestrator | 2026-04-07 02:49:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:17.380086 | orchestrator | 2026-04-07 02:49:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:20.417533 | orchestrator | 2026-04-07 02:49:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:20.419661 | orchestrator | 2026-04-07 02:49:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:20.419802 | orchestrator | 2026-04-07 02:49:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:23.465981 | orchestrator | 2026-04-07 02:49:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:23.469128 | orchestrator | 2026-04-07 02:49:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:23.469237 | orchestrator | 2026-04-07 02:49:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:26.517117 | orchestrator | 2026-04-07 02:49:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:26.519983 | orchestrator | 2026-04-07 02:49:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:49:26.520054 | orchestrator | 2026-04-07 02:49:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:29.563258 | orchestrator | 2026-04-07 02:49:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:29.565131 | orchestrator | 2026-04-07 02:49:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:29.565255 | orchestrator | 2026-04-07 02:49:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:32.610823 | orchestrator | 2026-04-07 02:49:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:32.612760 | orchestrator | 2026-04-07 02:49:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:32.612790 | orchestrator | 2026-04-07 02:49:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:35.657362 | orchestrator | 2026-04-07 02:49:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:35.659591 | orchestrator | 2026-04-07 02:49:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:35.659708 | orchestrator | 2026-04-07 02:49:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:38.714559 | orchestrator | 2026-04-07 02:49:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:38.717307 | orchestrator | 2026-04-07 02:49:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:38.717379 | orchestrator | 2026-04-07 02:49:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:41.764429 | orchestrator | 2026-04-07 02:49:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:41.766116 | orchestrator | 2026-04-07 02:49:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:41.766184 | orchestrator | 2026-04-07 02:49:41 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:49:44.812173 | orchestrator | 2026-04-07 02:49:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:44.813110 | orchestrator | 2026-04-07 02:49:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:44.813151 | orchestrator | 2026-04-07 02:49:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:47.860466 | orchestrator | 2026-04-07 02:49:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:47.861978 | orchestrator | 2026-04-07 02:49:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:47.862175 | orchestrator | 2026-04-07 02:49:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:50.903592 | orchestrator | 2026-04-07 02:49:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:50.904366 | orchestrator | 2026-04-07 02:49:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:50.904411 | orchestrator | 2026-04-07 02:49:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:53.946357 | orchestrator | 2026-04-07 02:49:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:53.948815 | orchestrator | 2026-04-07 02:49:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:53.948872 | orchestrator | 2026-04-07 02:49:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:49:56.995575 | orchestrator | 2026-04-07 02:49:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:49:56.997429 | orchestrator | 2026-04-07 02:49:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:49:56.997498 | orchestrator | 2026-04-07 02:49:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:00.029583 | orchestrator | 2026-04-07 
02:50:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:00.031030 | orchestrator | 2026-04-07 02:50:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:00.031061 | orchestrator | 2026-04-07 02:50:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:03.072164 | orchestrator | 2026-04-07 02:50:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:03.073284 | orchestrator | 2026-04-07 02:50:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:03.073354 | orchestrator | 2026-04-07 02:50:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:06.119740 | orchestrator | 2026-04-07 02:50:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:06.120034 | orchestrator | 2026-04-07 02:50:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:06.120098 | orchestrator | 2026-04-07 02:50:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:09.172396 | orchestrator | 2026-04-07 02:50:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:09.173737 | orchestrator | 2026-04-07 02:50:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:09.173778 | orchestrator | 2026-04-07 02:50:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:12.233095 | orchestrator | 2026-04-07 02:50:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:12.236593 | orchestrator | 2026-04-07 02:50:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:12.236705 | orchestrator | 2026-04-07 02:50:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:15.286908 | orchestrator | 2026-04-07 02:50:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:50:15.288260 | orchestrator | 2026-04-07 02:50:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:15.288317 | orchestrator | 2026-04-07 02:50:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:18.336903 | orchestrator | 2026-04-07 02:50:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:18.340267 | orchestrator | 2026-04-07 02:50:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:18.340352 | orchestrator | 2026-04-07 02:50:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:21.390242 | orchestrator | 2026-04-07 02:50:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:21.392419 | orchestrator | 2026-04-07 02:50:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:21.392472 | orchestrator | 2026-04-07 02:50:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:24.448496 | orchestrator | 2026-04-07 02:50:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:24.450802 | orchestrator | 2026-04-07 02:50:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:24.451198 | orchestrator | 2026-04-07 02:50:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:27.500466 | orchestrator | 2026-04-07 02:50:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:27.502116 | orchestrator | 2026-04-07 02:50:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:27.502155 | orchestrator | 2026-04-07 02:50:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:30.549346 | orchestrator | 2026-04-07 02:50:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:30.552042 | orchestrator | 2026-04-07 02:50:30 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:30.552089 | orchestrator | 2026-04-07 02:50:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:33.606139 | orchestrator | 2026-04-07 02:50:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:33.607822 | orchestrator | 2026-04-07 02:50:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:33.607885 | orchestrator | 2026-04-07 02:50:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:36.654068 | orchestrator | 2026-04-07 02:50:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:36.655815 | orchestrator | 2026-04-07 02:50:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:36.655868 | orchestrator | 2026-04-07 02:50:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:39.715229 | orchestrator | 2026-04-07 02:50:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:39.717196 | orchestrator | 2026-04-07 02:50:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:39.717254 | orchestrator | 2026-04-07 02:50:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:42.775117 | orchestrator | 2026-04-07 02:50:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:42.777668 | orchestrator | 2026-04-07 02:50:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:42.777764 | orchestrator | 2026-04-07 02:50:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:45.825504 | orchestrator | 2026-04-07 02:50:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:45.829887 | orchestrator | 2026-04-07 02:50:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:50:45.829960 | orchestrator | 2026-04-07 02:50:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:48.879353 | orchestrator | 2026-04-07 02:50:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:48.880817 | orchestrator | 2026-04-07 02:50:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:48.880890 | orchestrator | 2026-04-07 02:50:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:51.934316 | orchestrator | 2026-04-07 02:50:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:51.935980 | orchestrator | 2026-04-07 02:50:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:51.936031 | orchestrator | 2026-04-07 02:50:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:54.981297 | orchestrator | 2026-04-07 02:50:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:54.982995 | orchestrator | 2026-04-07 02:50:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:54.983078 | orchestrator | 2026-04-07 02:50:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:50:58.031449 | orchestrator | 2026-04-07 02:50:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:50:58.033307 | orchestrator | 2026-04-07 02:50:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:50:58.033354 | orchestrator | 2026-04-07 02:50:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:01.079046 | orchestrator | 2026-04-07 02:51:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:01.079873 | orchestrator | 2026-04-07 02:51:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:01.079912 | orchestrator | 2026-04-07 02:51:01 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:51:04.130936 | orchestrator | 2026-04-07 02:51:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:04.133116 | orchestrator | 2026-04-07 02:51:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:04.133537 | orchestrator | 2026-04-07 02:51:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:07.183565 | orchestrator | 2026-04-07 02:51:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:07.186109 | orchestrator | 2026-04-07 02:51:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:07.186215 | orchestrator | 2026-04-07 02:51:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:10.234084 | orchestrator | 2026-04-07 02:51:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:10.235927 | orchestrator | 2026-04-07 02:51:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:10.235997 | orchestrator | 2026-04-07 02:51:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:13.287144 | orchestrator | 2026-04-07 02:51:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:13.288184 | orchestrator | 2026-04-07 02:51:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:13.288221 | orchestrator | 2026-04-07 02:51:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:16.334380 | orchestrator | 2026-04-07 02:51:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:16.337368 | orchestrator | 2026-04-07 02:51:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:16.337416 | orchestrator | 2026-04-07 02:51:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:19.385364 | orchestrator | 2026-04-07 
02:51:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:19.387024 | orchestrator | 2026-04-07 02:51:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:19.387064 | orchestrator | 2026-04-07 02:51:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:22.438359 | orchestrator | 2026-04-07 02:51:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:22.440319 | orchestrator | 2026-04-07 02:51:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:22.440393 | orchestrator | 2026-04-07 02:51:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:25.492923 | orchestrator | 2026-04-07 02:51:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:25.495818 | orchestrator | 2026-04-07 02:51:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:25.495886 | orchestrator | 2026-04-07 02:51:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:28.536414 | orchestrator | 2026-04-07 02:51:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:28.537708 | orchestrator | 2026-04-07 02:51:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:28.537758 | orchestrator | 2026-04-07 02:51:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:31.592556 | orchestrator | 2026-04-07 02:51:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:31.595586 | orchestrator | 2026-04-07 02:51:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:31.595728 | orchestrator | 2026-04-07 02:51:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:34.644997 | orchestrator | 2026-04-07 02:51:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:51:34.646771 | orchestrator | 2026-04-07 02:51:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:34.646827 | orchestrator | 2026-04-07 02:51:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:37.697168 | orchestrator | 2026-04-07 02:51:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:37.699823 | orchestrator | 2026-04-07 02:51:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:37.699849 | orchestrator | 2026-04-07 02:51:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:40.749863 | orchestrator | 2026-04-07 02:51:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:40.751892 | orchestrator | 2026-04-07 02:51:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:40.752039 | orchestrator | 2026-04-07 02:51:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:43.801783 | orchestrator | 2026-04-07 02:51:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:43.802937 | orchestrator | 2026-04-07 02:51:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:43.803310 | orchestrator | 2026-04-07 02:51:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:46.847174 | orchestrator | 2026-04-07 02:51:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:46.848801 | orchestrator | 2026-04-07 02:51:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:46.848866 | orchestrator | 2026-04-07 02:51:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:49.896073 | orchestrator | 2026-04-07 02:51:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:49.898063 | orchestrator | 2026-04-07 02:51:49 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:49.898167 | orchestrator | 2026-04-07 02:51:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:52.943189 | orchestrator | 2026-04-07 02:51:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:52.946393 | orchestrator | 2026-04-07 02:51:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:52.946492 | orchestrator | 2026-04-07 02:51:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:55.995846 | orchestrator | 2026-04-07 02:51:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:55.998250 | orchestrator | 2026-04-07 02:51:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:55.998301 | orchestrator | 2026-04-07 02:51:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:51:59.049139 | orchestrator | 2026-04-07 02:51:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:51:59.050342 | orchestrator | 2026-04-07 02:51:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:51:59.050420 | orchestrator | 2026-04-07 02:51:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:52:02.104580 | orchestrator | 2026-04-07 02:52:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:52:02.107790 | orchestrator | 2026-04-07 02:52:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:52:02.107850 | orchestrator | 2026-04-07 02:52:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:52:05.150468 | orchestrator | 2026-04-07 02:52:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:52:05.152882 | orchestrator | 2026-04-07 02:52:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:52:05.152954 | orchestrator | 2026-04-07 02:52:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:52:08.198959 | orchestrator | 2026-04-07 02:52:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:52:08.201056 | orchestrator | 2026-04-07 02:52:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:52:08.201117 | orchestrator | 2026-04-07 02:52:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:52:11.251518 | orchestrator | 2026-04-07 02:52:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:52:11.252805 | orchestrator | 2026-04-07 02:52:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:52:11.252976 | orchestrator | 2026-04-07 02:52:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:52:14.306470 | orchestrator | 2026-04-07 02:52:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:52:14.307281 | orchestrator | 2026-04-07 02:52:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:52:14.307362 | orchestrator | 2026-04-07 02:52:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:52:17.363435 | orchestrator | 2026-04-07 02:52:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:52:17.365013 | orchestrator | 2026-04-07 02:52:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:52:17.365091 | orchestrator | 2026-04-07 02:52:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:52:20.418210 | orchestrator | 2026-04-07 02:52:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:52:20.420452 | orchestrator | 2026-04-07 02:52:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:52:20.420758 | orchestrator | 2026-04-07 02:52:20 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:52:23.467514 | orchestrator | 2026-04-07 02:52:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:52:23.469571 | orchestrator | 2026-04-07 02:52:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:52:23.470116 | orchestrator | 2026-04-07 02:52:23 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated roughly every 3 seconds from 02:52:26 to 02:57:34; tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remain in state STARTED throughout ...]
2026-04-07 02:57:37.577048 | orchestrator | 2026-04-07 02:57:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:57:37.578355 | orchestrator | 2026-04-07 02:57:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:57:37.578434 | orchestrator | 2026-04-07 02:57:37 | INFO  | Wait 1 second(s)
until the next check 2026-04-07 02:57:40.632438 | orchestrator | 2026-04-07 02:57:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:57:40.634265 | orchestrator | 2026-04-07 02:57:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:57:40.634323 | orchestrator | 2026-04-07 02:57:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:57:43.678247 | orchestrator | 2026-04-07 02:57:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:57:43.680592 | orchestrator | 2026-04-07 02:57:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:57:43.680656 | orchestrator | 2026-04-07 02:57:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:57:46.732919 | orchestrator | 2026-04-07 02:57:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:57:46.736162 | orchestrator | 2026-04-07 02:57:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:57:46.736216 | orchestrator | 2026-04-07 02:57:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:57:49.792850 | orchestrator | 2026-04-07 02:57:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:57:49.794407 | orchestrator | 2026-04-07 02:57:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:57:49.794475 | orchestrator | 2026-04-07 02:57:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:57:52.851100 | orchestrator | 2026-04-07 02:57:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:57:52.853555 | orchestrator | 2026-04-07 02:57:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:57:52.853630 | orchestrator | 2026-04-07 02:57:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:57:55.910112 | orchestrator | 2026-04-07 
02:57:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:57:55.911861 | orchestrator | 2026-04-07 02:57:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:57:55.911985 | orchestrator | 2026-04-07 02:57:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:57:58.959389 | orchestrator | 2026-04-07 02:57:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:57:58.962874 | orchestrator | 2026-04-07 02:57:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:57:58.962929 | orchestrator | 2026-04-07 02:57:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:02.006349 | orchestrator | 2026-04-07 02:58:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:02.008699 | orchestrator | 2026-04-07 02:58:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:02.008738 | orchestrator | 2026-04-07 02:58:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:05.057158 | orchestrator | 2026-04-07 02:58:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:05.060426 | orchestrator | 2026-04-07 02:58:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:05.060511 | orchestrator | 2026-04-07 02:58:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:08.109376 | orchestrator | 2026-04-07 02:58:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:08.111501 | orchestrator | 2026-04-07 02:58:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:08.111569 | orchestrator | 2026-04-07 02:58:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:11.170262 | orchestrator | 2026-04-07 02:58:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:58:11.171574 | orchestrator | 2026-04-07 02:58:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:11.171654 | orchestrator | 2026-04-07 02:58:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:14.239364 | orchestrator | 2026-04-07 02:58:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:14.249355 | orchestrator | 2026-04-07 02:58:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:14.249446 | orchestrator | 2026-04-07 02:58:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:17.291875 | orchestrator | 2026-04-07 02:58:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:17.294496 | orchestrator | 2026-04-07 02:58:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:17.294591 | orchestrator | 2026-04-07 02:58:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:20.343043 | orchestrator | 2026-04-07 02:58:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:20.343709 | orchestrator | 2026-04-07 02:58:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:20.343750 | orchestrator | 2026-04-07 02:58:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:23.389800 | orchestrator | 2026-04-07 02:58:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:23.391387 | orchestrator | 2026-04-07 02:58:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:23.391483 | orchestrator | 2026-04-07 02:58:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:26.439237 | orchestrator | 2026-04-07 02:58:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:26.439626 | orchestrator | 2026-04-07 02:58:26 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:26.439662 | orchestrator | 2026-04-07 02:58:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:29.486538 | orchestrator | 2026-04-07 02:58:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:29.488347 | orchestrator | 2026-04-07 02:58:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:29.488421 | orchestrator | 2026-04-07 02:58:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:32.538221 | orchestrator | 2026-04-07 02:58:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:32.540497 | orchestrator | 2026-04-07 02:58:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:32.540543 | orchestrator | 2026-04-07 02:58:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:35.587637 | orchestrator | 2026-04-07 02:58:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:35.590160 | orchestrator | 2026-04-07 02:58:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:35.590225 | orchestrator | 2026-04-07 02:58:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:38.641016 | orchestrator | 2026-04-07 02:58:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:38.642531 | orchestrator | 2026-04-07 02:58:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:38.642596 | orchestrator | 2026-04-07 02:58:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:41.681985 | orchestrator | 2026-04-07 02:58:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:41.684246 | orchestrator | 2026-04-07 02:58:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
02:58:41.684326 | orchestrator | 2026-04-07 02:58:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:44.724360 | orchestrator | 2026-04-07 02:58:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:44.725209 | orchestrator | 2026-04-07 02:58:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:44.725257 | orchestrator | 2026-04-07 02:58:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:47.774960 | orchestrator | 2026-04-07 02:58:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:47.776696 | orchestrator | 2026-04-07 02:58:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:47.776738 | orchestrator | 2026-04-07 02:58:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:50.824384 | orchestrator | 2026-04-07 02:58:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:50.825906 | orchestrator | 2026-04-07 02:58:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:50.826263 | orchestrator | 2026-04-07 02:58:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:53.875313 | orchestrator | 2026-04-07 02:58:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:53.876573 | orchestrator | 2026-04-07 02:58:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:53.876730 | orchestrator | 2026-04-07 02:58:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:58:56.927727 | orchestrator | 2026-04-07 02:58:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:56.929387 | orchestrator | 2026-04-07 02:58:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:56.929458 | orchestrator | 2026-04-07 02:58:56 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 02:58:59.975822 | orchestrator | 2026-04-07 02:58:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:58:59.977500 | orchestrator | 2026-04-07 02:58:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:58:59.977724 | orchestrator | 2026-04-07 02:58:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:03.028119 | orchestrator | 2026-04-07 02:59:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:03.034237 | orchestrator | 2026-04-07 02:59:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:03.034326 | orchestrator | 2026-04-07 02:59:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:06.085291 | orchestrator | 2026-04-07 02:59:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:06.086196 | orchestrator | 2026-04-07 02:59:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:06.086288 | orchestrator | 2026-04-07 02:59:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:09.127182 | orchestrator | 2026-04-07 02:59:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:09.130331 | orchestrator | 2026-04-07 02:59:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:09.130442 | orchestrator | 2026-04-07 02:59:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:12.179705 | orchestrator | 2026-04-07 02:59:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:12.181684 | orchestrator | 2026-04-07 02:59:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:12.181822 | orchestrator | 2026-04-07 02:59:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:15.227243 | orchestrator | 2026-04-07 
02:59:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:15.229377 | orchestrator | 2026-04-07 02:59:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:15.229441 | orchestrator | 2026-04-07 02:59:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:18.279972 | orchestrator | 2026-04-07 02:59:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:18.282560 | orchestrator | 2026-04-07 02:59:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:18.282632 | orchestrator | 2026-04-07 02:59:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:21.335177 | orchestrator | 2026-04-07 02:59:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:21.337423 | orchestrator | 2026-04-07 02:59:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:21.337480 | orchestrator | 2026-04-07 02:59:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:24.388921 | orchestrator | 2026-04-07 02:59:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:24.391199 | orchestrator | 2026-04-07 02:59:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:24.391275 | orchestrator | 2026-04-07 02:59:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:27.446206 | orchestrator | 2026-04-07 02:59:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:27.448684 | orchestrator | 2026-04-07 02:59:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:27.448764 | orchestrator | 2026-04-07 02:59:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:30.505190 | orchestrator | 2026-04-07 02:59:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 02:59:30.508063 | orchestrator | 2026-04-07 02:59:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:30.508131 | orchestrator | 2026-04-07 02:59:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:33.559327 | orchestrator | 2026-04-07 02:59:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:33.560301 | orchestrator | 2026-04-07 02:59:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:33.560373 | orchestrator | 2026-04-07 02:59:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:36.609029 | orchestrator | 2026-04-07 02:59:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:36.610920 | orchestrator | 2026-04-07 02:59:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:36.610967 | orchestrator | 2026-04-07 02:59:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:39.659766 | orchestrator | 2026-04-07 02:59:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:39.662699 | orchestrator | 2026-04-07 02:59:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:39.662755 | orchestrator | 2026-04-07 02:59:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:42.707969 | orchestrator | 2026-04-07 02:59:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:42.710662 | orchestrator | 2026-04-07 02:59:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:42.710737 | orchestrator | 2026-04-07 02:59:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:45.754702 | orchestrator | 2026-04-07 02:59:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:45.756459 | orchestrator | 2026-04-07 02:59:45 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:45.756547 | orchestrator | 2026-04-07 02:59:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:48.803901 | orchestrator | 2026-04-07 02:59:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:48.806272 | orchestrator | 2026-04-07 02:59:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:48.806332 | orchestrator | 2026-04-07 02:59:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:51.854536 | orchestrator | 2026-04-07 02:59:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:51.858135 | orchestrator | 2026-04-07 02:59:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:51.858198 | orchestrator | 2026-04-07 02:59:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:54.899627 | orchestrator | 2026-04-07 02:59:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:54.901704 | orchestrator | 2026-04-07 02:59:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:54.901838 | orchestrator | 2026-04-07 02:59:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 02:59:57.952604 | orchestrator | 2026-04-07 02:59:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 02:59:57.955149 | orchestrator | 2026-04-07 02:59:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 02:59:57.955253 | orchestrator | 2026-04-07 02:59:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:01.006663 | orchestrator | 2026-04-07 03:00:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:01.009916 | orchestrator | 2026-04-07 03:00:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:00:01.010327 | orchestrator | 2026-04-07 03:00:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:04.062427 | orchestrator | 2026-04-07 03:00:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:04.065151 | orchestrator | 2026-04-07 03:00:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:04.065232 | orchestrator | 2026-04-07 03:00:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:07.122396 | orchestrator | 2026-04-07 03:00:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:07.123502 | orchestrator | 2026-04-07 03:00:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:07.123562 | orchestrator | 2026-04-07 03:00:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:10.166944 | orchestrator | 2026-04-07 03:00:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:10.169284 | orchestrator | 2026-04-07 03:00:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:10.169432 | orchestrator | 2026-04-07 03:00:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:13.219024 | orchestrator | 2026-04-07 03:00:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:13.220296 | orchestrator | 2026-04-07 03:00:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:13.220337 | orchestrator | 2026-04-07 03:00:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:16.264699 | orchestrator | 2026-04-07 03:00:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:16.266256 | orchestrator | 2026-04-07 03:00:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:16.266307 | orchestrator | 2026-04-07 03:00:16 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:00:19.321626 | orchestrator | 2026-04-07 03:00:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:19.323724 | orchestrator | 2026-04-07 03:00:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:19.323772 | orchestrator | 2026-04-07 03:00:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:22.372668 | orchestrator | 2026-04-07 03:00:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:22.374242 | orchestrator | 2026-04-07 03:00:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:22.374291 | orchestrator | 2026-04-07 03:00:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:25.432465 | orchestrator | 2026-04-07 03:00:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:25.433084 | orchestrator | 2026-04-07 03:00:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:25.433143 | orchestrator | 2026-04-07 03:00:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:28.489434 | orchestrator | 2026-04-07 03:00:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:28.491602 | orchestrator | 2026-04-07 03:00:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:28.491736 | orchestrator | 2026-04-07 03:00:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:31.537641 | orchestrator | 2026-04-07 03:00:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:31.539777 | orchestrator | 2026-04-07 03:00:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:31.539898 | orchestrator | 2026-04-07 03:00:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:34.590869 | orchestrator | 2026-04-07 
03:00:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:34.593576 | orchestrator | 2026-04-07 03:00:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:34.593627 | orchestrator | 2026-04-07 03:00:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:37.643265 | orchestrator | 2026-04-07 03:00:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:37.644190 | orchestrator | 2026-04-07 03:00:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:37.644257 | orchestrator | 2026-04-07 03:00:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:40.693359 | orchestrator | 2026-04-07 03:00:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:40.693785 | orchestrator | 2026-04-07 03:00:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:40.694240 | orchestrator | 2026-04-07 03:00:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:43.748153 | orchestrator | 2026-04-07 03:00:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:43.748887 | orchestrator | 2026-04-07 03:00:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:43.748933 | orchestrator | 2026-04-07 03:00:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:46.800084 | orchestrator | 2026-04-07 03:00:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:46.800334 | orchestrator | 2026-04-07 03:00:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:46.800361 | orchestrator | 2026-04-07 03:00:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:49.842982 | orchestrator | 2026-04-07 03:00:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:00:49.843713 | orchestrator | 2026-04-07 03:00:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:49.843754 | orchestrator | 2026-04-07 03:00:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:52.892883 | orchestrator | 2026-04-07 03:00:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:52.893913 | orchestrator | 2026-04-07 03:00:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:52.893962 | orchestrator | 2026-04-07 03:00:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:55.945292 | orchestrator | 2026-04-07 03:00:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:55.946965 | orchestrator | 2026-04-07 03:00:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:55.947039 | orchestrator | 2026-04-07 03:00:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:00:58.992708 | orchestrator | 2026-04-07 03:00:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:00:58.995954 | orchestrator | 2026-04-07 03:00:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:00:58.996033 | orchestrator | 2026-04-07 03:00:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:01:02.044100 | orchestrator | 2026-04-07 03:01:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:01:02.051376 | orchestrator | 2026-04-07 03:01:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:01:02.051499 | orchestrator | 2026-04-07 03:01:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:01:05.103517 | orchestrator | 2026-04-07 03:01:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:01:05.105543 | orchestrator | 2026-04-07 03:01:05 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:01:05.105599 | orchestrator | 2026-04-07 03:01:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:01:08.155727 | orchestrator | 2026-04-07 03:01:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:01:08.160156 | orchestrator | 2026-04-07 03:01:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:01:08.160271 | orchestrator | 2026-04-07 03:01:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:01:11.218495 | orchestrator | 2026-04-07 03:01:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:01:11.219168 | orchestrator | 2026-04-07 03:01:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:01:11.219201 | orchestrator | 2026-04-07 03:01:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:01:14.273383 | orchestrator | 2026-04-07 03:01:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:01:14.276225 | orchestrator | 2026-04-07 03:01:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:01:14.276289 | orchestrator | 2026-04-07 03:01:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:01:17.328421 | orchestrator | 2026-04-07 03:01:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:01:17.329749 | orchestrator | 2026-04-07 03:01:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:01:17.329969 | orchestrator | 2026-04-07 03:01:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:01:20.373231 | orchestrator | 2026-04-07 03:01:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:01:20.374302 | orchestrator | 2026-04-07 03:01:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:01:20.374453 | orchestrator | 2026-04-07 03:01:20 | INFO  | Wait 1 second(s) until the next check
2026-04-07 03:01:23.418111 | orchestrator | 2026-04-07 03:01:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:01:23.418787 | orchestrator | 2026-04-07 03:01:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 03:01:23.418861 | orchestrator | 2026-04-07 03:01:23 | INFO  | Wait 1 second(s) until the next check
2026-04-07 03:08:52.957953 | orchestrator | 2026-04-07 03:08:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:08:52.960176 | orchestrator | 2026-04-07 03:08:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 03:08:52.960334 | orchestrator | 2026-04-07 03:08:52 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:08:56.015901 | orchestrator | 2026-04-07 03:08:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:08:56.015985 | orchestrator | 2026-04-07 03:08:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:08:56.015995 | orchestrator | 2026-04-07 03:08:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:08:59.056903 | orchestrator | 2026-04-07 03:08:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:08:59.057489 | orchestrator | 2026-04-07 03:08:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:08:59.057540 | orchestrator | 2026-04-07 03:08:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:02.106587 | orchestrator | 2026-04-07 03:09:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:02.108671 | orchestrator | 2026-04-07 03:09:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:02.108713 | orchestrator | 2026-04-07 03:09:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:05.158523 | orchestrator | 2026-04-07 03:09:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:05.161316 | orchestrator | 2026-04-07 03:09:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:05.161485 | orchestrator | 2026-04-07 03:09:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:08.212753 | orchestrator | 2026-04-07 03:09:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:08.216270 | orchestrator | 2026-04-07 03:09:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:08.216343 | orchestrator | 2026-04-07 03:09:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:11.270177 | orchestrator | 2026-04-07 
03:09:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:11.270756 | orchestrator | 2026-04-07 03:09:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:11.270794 | orchestrator | 2026-04-07 03:09:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:14.313203 | orchestrator | 2026-04-07 03:09:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:14.314608 | orchestrator | 2026-04-07 03:09:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:14.314691 | orchestrator | 2026-04-07 03:09:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:17.355270 | orchestrator | 2026-04-07 03:09:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:17.356511 | orchestrator | 2026-04-07 03:09:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:17.356555 | orchestrator | 2026-04-07 03:09:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:20.408989 | orchestrator | 2026-04-07 03:09:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:20.410091 | orchestrator | 2026-04-07 03:09:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:20.410146 | orchestrator | 2026-04-07 03:09:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:23.451599 | orchestrator | 2026-04-07 03:09:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:23.453116 | orchestrator | 2026-04-07 03:09:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:23.453147 | orchestrator | 2026-04-07 03:09:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:26.499879 | orchestrator | 2026-04-07 03:09:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:09:26.501046 | orchestrator | 2026-04-07 03:09:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:26.501075 | orchestrator | 2026-04-07 03:09:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:29.542182 | orchestrator | 2026-04-07 03:09:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:29.542936 | orchestrator | 2026-04-07 03:09:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:29.542988 | orchestrator | 2026-04-07 03:09:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:32.587452 | orchestrator | 2026-04-07 03:09:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:32.589309 | orchestrator | 2026-04-07 03:09:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:32.589387 | orchestrator | 2026-04-07 03:09:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:35.643441 | orchestrator | 2026-04-07 03:09:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:35.645386 | orchestrator | 2026-04-07 03:09:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:35.645442 | orchestrator | 2026-04-07 03:09:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:38.705968 | orchestrator | 2026-04-07 03:09:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:38.706995 | orchestrator | 2026-04-07 03:09:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:38.707064 | orchestrator | 2026-04-07 03:09:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:41.748270 | orchestrator | 2026-04-07 03:09:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:41.749146 | orchestrator | 2026-04-07 03:09:41 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:41.749176 | orchestrator | 2026-04-07 03:09:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:44.796467 | orchestrator | 2026-04-07 03:09:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:44.798340 | orchestrator | 2026-04-07 03:09:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:44.798770 | orchestrator | 2026-04-07 03:09:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:47.848112 | orchestrator | 2026-04-07 03:09:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:47.849605 | orchestrator | 2026-04-07 03:09:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:47.849667 | orchestrator | 2026-04-07 03:09:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:50.888336 | orchestrator | 2026-04-07 03:09:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:50.889556 | orchestrator | 2026-04-07 03:09:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:50.889638 | orchestrator | 2026-04-07 03:09:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:53.933899 | orchestrator | 2026-04-07 03:09:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:53.934351 | orchestrator | 2026-04-07 03:09:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:09:53.934443 | orchestrator | 2026-04-07 03:09:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:09:56.982484 | orchestrator | 2026-04-07 03:09:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:09:56.983848 | orchestrator | 2026-04-07 03:09:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:09:56.983910 | orchestrator | 2026-04-07 03:09:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:00.035765 | orchestrator | 2026-04-07 03:10:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:00.038595 | orchestrator | 2026-04-07 03:10:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:00.038674 | orchestrator | 2026-04-07 03:10:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:03.078330 | orchestrator | 2026-04-07 03:10:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:03.079368 | orchestrator | 2026-04-07 03:10:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:03.079403 | orchestrator | 2026-04-07 03:10:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:06.121703 | orchestrator | 2026-04-07 03:10:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:06.122723 | orchestrator | 2026-04-07 03:10:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:06.122791 | orchestrator | 2026-04-07 03:10:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:09.175373 | orchestrator | 2026-04-07 03:10:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:09.176964 | orchestrator | 2026-04-07 03:10:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:09.177048 | orchestrator | 2026-04-07 03:10:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:12.220266 | orchestrator | 2026-04-07 03:10:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:12.221023 | orchestrator | 2026-04-07 03:10:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:12.221069 | orchestrator | 2026-04-07 03:10:12 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:10:15.271071 | orchestrator | 2026-04-07 03:10:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:15.272868 | orchestrator | 2026-04-07 03:10:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:15.272984 | orchestrator | 2026-04-07 03:10:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:18.317324 | orchestrator | 2026-04-07 03:10:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:18.320345 | orchestrator | 2026-04-07 03:10:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:18.320399 | orchestrator | 2026-04-07 03:10:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:21.362566 | orchestrator | 2026-04-07 03:10:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:21.364054 | orchestrator | 2026-04-07 03:10:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:21.364236 | orchestrator | 2026-04-07 03:10:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:24.414692 | orchestrator | 2026-04-07 03:10:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:24.416395 | orchestrator | 2026-04-07 03:10:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:24.416459 | orchestrator | 2026-04-07 03:10:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:27.459823 | orchestrator | 2026-04-07 03:10:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:27.462653 | orchestrator | 2026-04-07 03:10:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:27.462720 | orchestrator | 2026-04-07 03:10:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:30.508728 | orchestrator | 2026-04-07 
03:10:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:30.509669 | orchestrator | 2026-04-07 03:10:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:30.509719 | orchestrator | 2026-04-07 03:10:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:33.553203 | orchestrator | 2026-04-07 03:10:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:33.554342 | orchestrator | 2026-04-07 03:10:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:33.554503 | orchestrator | 2026-04-07 03:10:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:36.604427 | orchestrator | 2026-04-07 03:10:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:36.605375 | orchestrator | 2026-04-07 03:10:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:36.605415 | orchestrator | 2026-04-07 03:10:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:39.643865 | orchestrator | 2026-04-07 03:10:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:39.644858 | orchestrator | 2026-04-07 03:10:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:39.644887 | orchestrator | 2026-04-07 03:10:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:42.680189 | orchestrator | 2026-04-07 03:10:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:42.682950 | orchestrator | 2026-04-07 03:10:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:42.683025 | orchestrator | 2026-04-07 03:10:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:45.729906 | orchestrator | 2026-04-07 03:10:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:10:45.734118 | orchestrator | 2026-04-07 03:10:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:45.734226 | orchestrator | 2026-04-07 03:10:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:48.777453 | orchestrator | 2026-04-07 03:10:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:48.778664 | orchestrator | 2026-04-07 03:10:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:48.778700 | orchestrator | 2026-04-07 03:10:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:51.823270 | orchestrator | 2026-04-07 03:10:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:51.826115 | orchestrator | 2026-04-07 03:10:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:51.826162 | orchestrator | 2026-04-07 03:10:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:54.873162 | orchestrator | 2026-04-07 03:10:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:54.874700 | orchestrator | 2026-04-07 03:10:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:54.874775 | orchestrator | 2026-04-07 03:10:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:10:57.919434 | orchestrator | 2026-04-07 03:10:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:10:57.921648 | orchestrator | 2026-04-07 03:10:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:10:57.921852 | orchestrator | 2026-04-07 03:10:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:00.967162 | orchestrator | 2026-04-07 03:11:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:00.969229 | orchestrator | 2026-04-07 03:11:00 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:00.969275 | orchestrator | 2026-04-07 03:11:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:04.009178 | orchestrator | 2026-04-07 03:11:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:04.009744 | orchestrator | 2026-04-07 03:11:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:04.009777 | orchestrator | 2026-04-07 03:11:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:07.055413 | orchestrator | 2026-04-07 03:11:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:07.056555 | orchestrator | 2026-04-07 03:11:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:07.056609 | orchestrator | 2026-04-07 03:11:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:10.101389 | orchestrator | 2026-04-07 03:11:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:10.102927 | orchestrator | 2026-04-07 03:11:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:10.103025 | orchestrator | 2026-04-07 03:11:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:13.151271 | orchestrator | 2026-04-07 03:11:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:13.152714 | orchestrator | 2026-04-07 03:11:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:13.152788 | orchestrator | 2026-04-07 03:11:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:16.187111 | orchestrator | 2026-04-07 03:11:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:16.187365 | orchestrator | 2026-04-07 03:11:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:11:16.187393 | orchestrator | 2026-04-07 03:11:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:19.225654 | orchestrator | 2026-04-07 03:11:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:19.226281 | orchestrator | 2026-04-07 03:11:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:19.226316 | orchestrator | 2026-04-07 03:11:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:22.265049 | orchestrator | 2026-04-07 03:11:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:22.266190 | orchestrator | 2026-04-07 03:11:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:22.266287 | orchestrator | 2026-04-07 03:11:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:25.314832 | orchestrator | 2026-04-07 03:11:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:25.318607 | orchestrator | 2026-04-07 03:11:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:25.318680 | orchestrator | 2026-04-07 03:11:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:28.365571 | orchestrator | 2026-04-07 03:11:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:28.367689 | orchestrator | 2026-04-07 03:11:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:28.367759 | orchestrator | 2026-04-07 03:11:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:31.419350 | orchestrator | 2026-04-07 03:11:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:31.421592 | orchestrator | 2026-04-07 03:11:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:31.421655 | orchestrator | 2026-04-07 03:11:31 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:11:34.464780 | orchestrator | 2026-04-07 03:11:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:34.465792 | orchestrator | 2026-04-07 03:11:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:34.465932 | orchestrator | 2026-04-07 03:11:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:37.515447 | orchestrator | 2026-04-07 03:11:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:37.517488 | orchestrator | 2026-04-07 03:11:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:37.517569 | orchestrator | 2026-04-07 03:11:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:40.564908 | orchestrator | 2026-04-07 03:11:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:40.566130 | orchestrator | 2026-04-07 03:11:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:40.566174 | orchestrator | 2026-04-07 03:11:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:43.603356 | orchestrator | 2026-04-07 03:11:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:43.605869 | orchestrator | 2026-04-07 03:11:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:43.605958 | orchestrator | 2026-04-07 03:11:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:46.655534 | orchestrator | 2026-04-07 03:11:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:46.657560 | orchestrator | 2026-04-07 03:11:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:46.657615 | orchestrator | 2026-04-07 03:11:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:49.703124 | orchestrator | 2026-04-07 
03:11:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:49.704639 | orchestrator | 2026-04-07 03:11:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:49.704682 | orchestrator | 2026-04-07 03:11:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:52.753779 | orchestrator | 2026-04-07 03:11:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:52.755410 | orchestrator | 2026-04-07 03:11:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:52.755467 | orchestrator | 2026-04-07 03:11:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:55.803760 | orchestrator | 2026-04-07 03:11:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:55.804667 | orchestrator | 2026-04-07 03:11:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:55.804761 | orchestrator | 2026-04-07 03:11:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:11:58.843572 | orchestrator | 2026-04-07 03:11:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:11:58.846253 | orchestrator | 2026-04-07 03:11:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:11:58.846327 | orchestrator | 2026-04-07 03:11:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:01.898920 | orchestrator | 2026-04-07 03:12:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:01.899243 | orchestrator | 2026-04-07 03:12:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:01.899286 | orchestrator | 2026-04-07 03:12:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:04.951288 | orchestrator | 2026-04-07 03:12:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:12:04.952325 | orchestrator | 2026-04-07 03:12:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:04.952476 | orchestrator | 2026-04-07 03:12:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:07.997151 | orchestrator | 2026-04-07 03:12:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:07.997602 | orchestrator | 2026-04-07 03:12:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:07.997641 | orchestrator | 2026-04-07 03:12:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:11.046277 | orchestrator | 2026-04-07 03:12:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:11.048070 | orchestrator | 2026-04-07 03:12:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:11.048134 | orchestrator | 2026-04-07 03:12:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:14.083991 | orchestrator | 2026-04-07 03:12:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:14.086177 | orchestrator | 2026-04-07 03:12:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:14.086232 | orchestrator | 2026-04-07 03:12:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:17.134167 | orchestrator | 2026-04-07 03:12:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:17.135629 | orchestrator | 2026-04-07 03:12:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:17.135680 | orchestrator | 2026-04-07 03:12:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:20.182622 | orchestrator | 2026-04-07 03:12:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:20.184969 | orchestrator | 2026-04-07 03:12:20 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:20.185183 | orchestrator | 2026-04-07 03:12:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:23.236464 | orchestrator | 2026-04-07 03:12:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:23.238626 | orchestrator | 2026-04-07 03:12:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:23.238707 | orchestrator | 2026-04-07 03:12:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:26.290441 | orchestrator | 2026-04-07 03:12:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:26.292821 | orchestrator | 2026-04-07 03:12:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:26.292872 | orchestrator | 2026-04-07 03:12:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:29.333916 | orchestrator | 2026-04-07 03:12:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:29.335562 | orchestrator | 2026-04-07 03:12:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:29.335627 | orchestrator | 2026-04-07 03:12:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:32.384198 | orchestrator | 2026-04-07 03:12:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:32.385744 | orchestrator | 2026-04-07 03:12:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:32.386246 | orchestrator | 2026-04-07 03:12:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:35.435287 | orchestrator | 2026-04-07 03:12:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:35.436963 | orchestrator | 2026-04-07 03:12:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:12:35.437019 | orchestrator | 2026-04-07 03:12:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:38.486842 | orchestrator | 2026-04-07 03:12:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:38.488415 | orchestrator | 2026-04-07 03:12:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:38.488477 | orchestrator | 2026-04-07 03:12:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:41.537544 | orchestrator | 2026-04-07 03:12:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:41.539104 | orchestrator | 2026-04-07 03:12:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:41.539157 | orchestrator | 2026-04-07 03:12:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:44.585104 | orchestrator | 2026-04-07 03:12:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:44.586507 | orchestrator | 2026-04-07 03:12:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:44.586562 | orchestrator | 2026-04-07 03:12:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:47.632851 | orchestrator | 2026-04-07 03:12:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:47.634101 | orchestrator | 2026-04-07 03:12:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:47.634171 | orchestrator | 2026-04-07 03:12:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:12:50.674707 | orchestrator | 2026-04-07 03:12:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:12:50.677528 | orchestrator | 2026-04-07 03:12:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:12:50.677622 | orchestrator | 2026-04-07 03:12:50 | INFO  | Wait 1 second(s) 
2026-04-07 03:17:37.330726 | orchestrator | 2026-04-07 03:17:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:17:37.331688 | orchestrator | 2026-04-07 03:17:37 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:17:37.331927 | orchestrator | 2026-04-07 03:17:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:17:40.373333 | orchestrator | 2026-04-07 03:17:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:17:40.375199 | orchestrator | 2026-04-07 03:17:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:17:40.375242 | orchestrator | 2026-04-07 03:17:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:17:43.419505 | orchestrator | 2026-04-07 03:17:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:17:43.420375 | orchestrator | 2026-04-07 03:17:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:17:43.420444 | orchestrator | 2026-04-07 03:17:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:17:46.471996 | orchestrator | 2026-04-07 03:17:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:17:46.473366 | orchestrator | 2026-04-07 03:17:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:17:46.473633 | orchestrator | 2026-04-07 03:17:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:17:49.531665 | orchestrator | 2026-04-07 03:17:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:17:49.532879 | orchestrator | 2026-04-07 03:17:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:17:49.532911 | orchestrator | 2026-04-07 03:17:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:17:52.585907 | orchestrator | 2026-04-07 03:17:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:17:52.586985 | orchestrator | 2026-04-07 03:17:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:17:52.587014 | orchestrator | 2026-04-07 03:17:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:17:55.636577 | orchestrator | 2026-04-07 03:17:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:17:55.637767 | orchestrator | 2026-04-07 03:17:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:17:55.637859 | orchestrator | 2026-04-07 03:17:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:17:58.684436 | orchestrator | 2026-04-07 03:17:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:17:58.686459 | orchestrator | 2026-04-07 03:17:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:17:58.686508 | orchestrator | 2026-04-07 03:17:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:01.740861 | orchestrator | 2026-04-07 03:18:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:01.743897 | orchestrator | 2026-04-07 03:18:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:01.743974 | orchestrator | 2026-04-07 03:18:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:04.793462 | orchestrator | 2026-04-07 03:18:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:04.794660 | orchestrator | 2026-04-07 03:18:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:04.794817 | orchestrator | 2026-04-07 03:18:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:07.842369 | orchestrator | 2026-04-07 03:18:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:07.843747 | orchestrator | 2026-04-07 03:18:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:07.843843 | orchestrator | 2026-04-07 03:18:07 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:18:10.892716 | orchestrator | 2026-04-07 03:18:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:10.893109 | orchestrator | 2026-04-07 03:18:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:10.893140 | orchestrator | 2026-04-07 03:18:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:13.938418 | orchestrator | 2026-04-07 03:18:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:13.939913 | orchestrator | 2026-04-07 03:18:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:13.939964 | orchestrator | 2026-04-07 03:18:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:16.990289 | orchestrator | 2026-04-07 03:18:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:16.990831 | orchestrator | 2026-04-07 03:18:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:16.990857 | orchestrator | 2026-04-07 03:18:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:20.036416 | orchestrator | 2026-04-07 03:18:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:20.037430 | orchestrator | 2026-04-07 03:18:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:20.037622 | orchestrator | 2026-04-07 03:18:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:23.088888 | orchestrator | 2026-04-07 03:18:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:23.090778 | orchestrator | 2026-04-07 03:18:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:23.090828 | orchestrator | 2026-04-07 03:18:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:26.137422 | orchestrator | 2026-04-07 
03:18:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:26.138230 | orchestrator | 2026-04-07 03:18:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:26.138344 | orchestrator | 2026-04-07 03:18:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:29.175381 | orchestrator | 2026-04-07 03:18:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:29.177446 | orchestrator | 2026-04-07 03:18:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:29.177507 | orchestrator | 2026-04-07 03:18:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:32.221639 | orchestrator | 2026-04-07 03:18:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:32.222825 | orchestrator | 2026-04-07 03:18:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:32.222861 | orchestrator | 2026-04-07 03:18:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:35.269363 | orchestrator | 2026-04-07 03:18:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:35.270875 | orchestrator | 2026-04-07 03:18:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:35.270903 | orchestrator | 2026-04-07 03:18:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:38.312678 | orchestrator | 2026-04-07 03:18:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:38.314512 | orchestrator | 2026-04-07 03:18:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:38.314592 | orchestrator | 2026-04-07 03:18:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:41.355464 | orchestrator | 2026-04-07 03:18:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:18:41.356265 | orchestrator | 2026-04-07 03:18:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:41.356304 | orchestrator | 2026-04-07 03:18:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:44.407687 | orchestrator | 2026-04-07 03:18:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:44.409555 | orchestrator | 2026-04-07 03:18:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:44.409661 | orchestrator | 2026-04-07 03:18:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:47.461630 | orchestrator | 2026-04-07 03:18:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:47.463128 | orchestrator | 2026-04-07 03:18:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:47.463184 | orchestrator | 2026-04-07 03:18:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:50.520193 | orchestrator | 2026-04-07 03:18:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:50.523212 | orchestrator | 2026-04-07 03:18:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:50.523253 | orchestrator | 2026-04-07 03:18:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:53.576626 | orchestrator | 2026-04-07 03:18:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:53.579780 | orchestrator | 2026-04-07 03:18:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:53.579863 | orchestrator | 2026-04-07 03:18:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:56.629374 | orchestrator | 2026-04-07 03:18:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:56.631813 | orchestrator | 2026-04-07 03:18:56 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:56.631905 | orchestrator | 2026-04-07 03:18:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:18:59.677803 | orchestrator | 2026-04-07 03:18:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:18:59.679604 | orchestrator | 2026-04-07 03:18:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:18:59.679656 | orchestrator | 2026-04-07 03:18:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:02.732154 | orchestrator | 2026-04-07 03:19:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:02.732343 | orchestrator | 2026-04-07 03:19:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:02.732403 | orchestrator | 2026-04-07 03:19:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:05.782779 | orchestrator | 2026-04-07 03:19:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:05.784976 | orchestrator | 2026-04-07 03:19:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:05.785013 | orchestrator | 2026-04-07 03:19:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:08.835405 | orchestrator | 2026-04-07 03:19:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:08.837026 | orchestrator | 2026-04-07 03:19:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:08.837073 | orchestrator | 2026-04-07 03:19:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:11.887332 | orchestrator | 2026-04-07 03:19:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:11.889047 | orchestrator | 2026-04-07 03:19:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:19:11.889110 | orchestrator | 2026-04-07 03:19:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:14.942791 | orchestrator | 2026-04-07 03:19:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:14.944691 | orchestrator | 2026-04-07 03:19:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:14.944930 | orchestrator | 2026-04-07 03:19:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:17.998459 | orchestrator | 2026-04-07 03:19:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:18.003887 | orchestrator | 2026-04-07 03:19:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:18.003960 | orchestrator | 2026-04-07 03:19:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:21.049985 | orchestrator | 2026-04-07 03:19:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:21.052037 | orchestrator | 2026-04-07 03:19:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:21.052102 | orchestrator | 2026-04-07 03:19:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:24.095110 | orchestrator | 2026-04-07 03:19:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:24.097737 | orchestrator | 2026-04-07 03:19:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:24.097806 | orchestrator | 2026-04-07 03:19:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:27.152855 | orchestrator | 2026-04-07 03:19:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:27.155073 | orchestrator | 2026-04-07 03:19:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:27.155137 | orchestrator | 2026-04-07 03:19:27 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:19:30.197246 | orchestrator | 2026-04-07 03:19:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:30.199928 | orchestrator | 2026-04-07 03:19:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:30.200016 | orchestrator | 2026-04-07 03:19:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:33.241879 | orchestrator | 2026-04-07 03:19:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:33.243194 | orchestrator | 2026-04-07 03:19:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:33.243238 | orchestrator | 2026-04-07 03:19:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:36.289107 | orchestrator | 2026-04-07 03:19:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:36.291596 | orchestrator | 2026-04-07 03:19:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:36.292034 | orchestrator | 2026-04-07 03:19:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:39.336458 | orchestrator | 2026-04-07 03:19:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:39.338481 | orchestrator | 2026-04-07 03:19:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:39.338713 | orchestrator | 2026-04-07 03:19:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:42.384046 | orchestrator | 2026-04-07 03:19:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:42.385791 | orchestrator | 2026-04-07 03:19:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:42.385845 | orchestrator | 2026-04-07 03:19:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:45.428823 | orchestrator | 2026-04-07 
03:19:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:45.429771 | orchestrator | 2026-04-07 03:19:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:45.429832 | orchestrator | 2026-04-07 03:19:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:48.483244 | orchestrator | 2026-04-07 03:19:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:48.485719 | orchestrator | 2026-04-07 03:19:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:48.485772 | orchestrator | 2026-04-07 03:19:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:51.529749 | orchestrator | 2026-04-07 03:19:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:51.531397 | orchestrator | 2026-04-07 03:19:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:51.531462 | orchestrator | 2026-04-07 03:19:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:54.585652 | orchestrator | 2026-04-07 03:19:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:54.587481 | orchestrator | 2026-04-07 03:19:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:54.587616 | orchestrator | 2026-04-07 03:19:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:19:57.635008 | orchestrator | 2026-04-07 03:19:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:19:57.635259 | orchestrator | 2026-04-07 03:19:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:19:57.636053 | orchestrator | 2026-04-07 03:19:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:00.684117 | orchestrator | 2026-04-07 03:20:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:20:00.686177 | orchestrator | 2026-04-07 03:20:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:00.686251 | orchestrator | 2026-04-07 03:20:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:03.728786 | orchestrator | 2026-04-07 03:20:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:03.730186 | orchestrator | 2026-04-07 03:20:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:03.730258 | orchestrator | 2026-04-07 03:20:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:06.771502 | orchestrator | 2026-04-07 03:20:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:06.772995 | orchestrator | 2026-04-07 03:20:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:06.773060 | orchestrator | 2026-04-07 03:20:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:09.825249 | orchestrator | 2026-04-07 03:20:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:09.827335 | orchestrator | 2026-04-07 03:20:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:09.827408 | orchestrator | 2026-04-07 03:20:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:12.870954 | orchestrator | 2026-04-07 03:20:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:12.872902 | orchestrator | 2026-04-07 03:20:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:12.872954 | orchestrator | 2026-04-07 03:20:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:15.920427 | orchestrator | 2026-04-07 03:20:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:15.922931 | orchestrator | 2026-04-07 03:20:15 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:15.923025 | orchestrator | 2026-04-07 03:20:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:18.972320 | orchestrator | 2026-04-07 03:20:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:18.973719 | orchestrator | 2026-04-07 03:20:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:18.973770 | orchestrator | 2026-04-07 03:20:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:22.022937 | orchestrator | 2026-04-07 03:20:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:22.024677 | orchestrator | 2026-04-07 03:20:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:22.024736 | orchestrator | 2026-04-07 03:20:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:25.063386 | orchestrator | 2026-04-07 03:20:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:25.065342 | orchestrator | 2026-04-07 03:20:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:25.065398 | orchestrator | 2026-04-07 03:20:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:28.114273 | orchestrator | 2026-04-07 03:20:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:28.115075 | orchestrator | 2026-04-07 03:20:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:28.115106 | orchestrator | 2026-04-07 03:20:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:31.168600 | orchestrator | 2026-04-07 03:20:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:31.170835 | orchestrator | 2026-04-07 03:20:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:20:31.170882 | orchestrator | 2026-04-07 03:20:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:34.216147 | orchestrator | 2026-04-07 03:20:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:34.217472 | orchestrator | 2026-04-07 03:20:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:34.217564 | orchestrator | 2026-04-07 03:20:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:37.267670 | orchestrator | 2026-04-07 03:20:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:37.268888 | orchestrator | 2026-04-07 03:20:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:37.268925 | orchestrator | 2026-04-07 03:20:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:40.317862 | orchestrator | 2026-04-07 03:20:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:40.319603 | orchestrator | 2026-04-07 03:20:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:40.319674 | orchestrator | 2026-04-07 03:20:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:43.364797 | orchestrator | 2026-04-07 03:20:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:43.366287 | orchestrator | 2026-04-07 03:20:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:43.366326 | orchestrator | 2026-04-07 03:20:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:46.416280 | orchestrator | 2026-04-07 03:20:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:46.418168 | orchestrator | 2026-04-07 03:20:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:46.418234 | orchestrator | 2026-04-07 03:20:46 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:20:49.465685 | orchestrator | 2026-04-07 03:20:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:49.465909 | orchestrator | 2026-04-07 03:20:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:49.465981 | orchestrator | 2026-04-07 03:20:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:52.515217 | orchestrator | 2026-04-07 03:20:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:52.516961 | orchestrator | 2026-04-07 03:20:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:52.517012 | orchestrator | 2026-04-07 03:20:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:55.561271 | orchestrator | 2026-04-07 03:20:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:55.563959 | orchestrator | 2026-04-07 03:20:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:55.564008 | orchestrator | 2026-04-07 03:20:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:20:58.603992 | orchestrator | 2026-04-07 03:20:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:20:58.605918 | orchestrator | 2026-04-07 03:20:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:20:58.605983 | orchestrator | 2026-04-07 03:20:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:21:01.650476 | orchestrator | 2026-04-07 03:21:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:21:01.651669 | orchestrator | 2026-04-07 03:21:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:21:01.651715 | orchestrator | 2026-04-07 03:21:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:21:04.696675 | orchestrator | 2026-04-07 
03:21:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:21:04.698205 | orchestrator | 2026-04-07 03:21:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:21:04.698284 | orchestrator | 2026-04-07 03:21:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:21:07.743488 | orchestrator | 2026-04-07 03:21:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:21:07.746072 | orchestrator | 2026-04-07 03:21:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:21:07.746136 | orchestrator | 2026-04-07 03:21:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:21:10.793268 | orchestrator | 2026-04-07 03:21:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:21:10.795207 | orchestrator | 2026-04-07 03:21:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:21:10.795278 | orchestrator | 2026-04-07 03:21:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:21:13.844549 | orchestrator | 2026-04-07 03:21:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:21:13.844742 | orchestrator | 2026-04-07 03:21:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:21:13.844788 | orchestrator | 2026-04-07 03:21:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:21:16.896290 | orchestrator | 2026-04-07 03:21:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:21:16.898206 | orchestrator | 2026-04-07 03:21:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:21:16.898275 | orchestrator | 2026-04-07 03:21:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:21:19.946229 | orchestrator | 2026-04-07 03:21:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED
2026-04-07 03:21:19.949219 | orchestrator | 2026-04-07 03:21:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 03:21:19.949280 | orchestrator | 2026-04-07 03:21:19 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 s from 03:21:22 to 03:26:49: tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remain in state STARTED, followed by "Wait 1 second(s) until the next check" ...]
2026-04-07 03:26:52.445571 | orchestrator | 2026-04-07 03:26:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:26:52.446224 | orchestrator | 2026-04-07 03:26:52 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:26:52.446274 | orchestrator | 2026-04-07 03:26:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:26:55.497875 | orchestrator | 2026-04-07 03:26:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:26:55.498135 | orchestrator | 2026-04-07 03:26:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:26:55.498168 | orchestrator | 2026-04-07 03:26:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:26:58.548884 | orchestrator | 2026-04-07 03:26:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:26:58.552118 | orchestrator | 2026-04-07 03:26:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:26:58.552271 | orchestrator | 2026-04-07 03:26:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:01.597584 | orchestrator | 2026-04-07 03:27:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:01.599175 | orchestrator | 2026-04-07 03:27:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:01.599233 | orchestrator | 2026-04-07 03:27:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:04.646418 | orchestrator | 2026-04-07 03:27:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:04.647996 | orchestrator | 2026-04-07 03:27:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:04.648185 | orchestrator | 2026-04-07 03:27:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:07.696757 | orchestrator | 2026-04-07 03:27:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:07.699652 | orchestrator | 2026-04-07 03:27:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:27:07.699840 | orchestrator | 2026-04-07 03:27:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:10.760636 | orchestrator | 2026-04-07 03:27:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:10.763372 | orchestrator | 2026-04-07 03:27:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:10.763483 | orchestrator | 2026-04-07 03:27:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:13.818641 | orchestrator | 2026-04-07 03:27:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:13.821520 | orchestrator | 2026-04-07 03:27:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:13.821601 | orchestrator | 2026-04-07 03:27:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:16.873712 | orchestrator | 2026-04-07 03:27:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:16.876311 | orchestrator | 2026-04-07 03:27:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:16.876462 | orchestrator | 2026-04-07 03:27:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:19.927714 | orchestrator | 2026-04-07 03:27:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:19.928649 | orchestrator | 2026-04-07 03:27:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:19.928774 | orchestrator | 2026-04-07 03:27:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:22.974510 | orchestrator | 2026-04-07 03:27:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:22.976095 | orchestrator | 2026-04-07 03:27:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:22.976166 | orchestrator | 2026-04-07 03:27:22 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:27:26.028985 | orchestrator | 2026-04-07 03:27:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:26.033696 | orchestrator | 2026-04-07 03:27:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:26.033845 | orchestrator | 2026-04-07 03:27:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:29.083392 | orchestrator | 2026-04-07 03:27:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:29.087081 | orchestrator | 2026-04-07 03:27:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:29.087206 | orchestrator | 2026-04-07 03:27:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:32.140066 | orchestrator | 2026-04-07 03:27:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:32.141765 | orchestrator | 2026-04-07 03:27:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:32.141823 | orchestrator | 2026-04-07 03:27:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:35.195095 | orchestrator | 2026-04-07 03:27:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:35.200284 | orchestrator | 2026-04-07 03:27:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:35.200431 | orchestrator | 2026-04-07 03:27:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:38.252268 | orchestrator | 2026-04-07 03:27:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:38.256519 | orchestrator | 2026-04-07 03:27:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:38.256632 | orchestrator | 2026-04-07 03:27:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:41.304454 | orchestrator | 2026-04-07 
03:27:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:41.305939 | orchestrator | 2026-04-07 03:27:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:41.305972 | orchestrator | 2026-04-07 03:27:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:44.356102 | orchestrator | 2026-04-07 03:27:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:44.358638 | orchestrator | 2026-04-07 03:27:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:44.358700 | orchestrator | 2026-04-07 03:27:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:47.411764 | orchestrator | 2026-04-07 03:27:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:47.414475 | orchestrator | 2026-04-07 03:27:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:47.414534 | orchestrator | 2026-04-07 03:27:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:50.461414 | orchestrator | 2026-04-07 03:27:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:50.463449 | orchestrator | 2026-04-07 03:27:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:50.463509 | orchestrator | 2026-04-07 03:27:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:53.511479 | orchestrator | 2026-04-07 03:27:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:53.512547 | orchestrator | 2026-04-07 03:27:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:53.512695 | orchestrator | 2026-04-07 03:27:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:56.560912 | orchestrator | 2026-04-07 03:27:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:27:56.562186 | orchestrator | 2026-04-07 03:27:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:56.562261 | orchestrator | 2026-04-07 03:27:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:27:59.609266 | orchestrator | 2026-04-07 03:27:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:27:59.611776 | orchestrator | 2026-04-07 03:27:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:27:59.611911 | orchestrator | 2026-04-07 03:27:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:02.652976 | orchestrator | 2026-04-07 03:28:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:02.655156 | orchestrator | 2026-04-07 03:28:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:02.655232 | orchestrator | 2026-04-07 03:28:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:05.703183 | orchestrator | 2026-04-07 03:28:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:05.705309 | orchestrator | 2026-04-07 03:28:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:05.705390 | orchestrator | 2026-04-07 03:28:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:08.756246 | orchestrator | 2026-04-07 03:28:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:08.758792 | orchestrator | 2026-04-07 03:28:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:08.758914 | orchestrator | 2026-04-07 03:28:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:11.804714 | orchestrator | 2026-04-07 03:28:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:11.807594 | orchestrator | 2026-04-07 03:28:11 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:11.807660 | orchestrator | 2026-04-07 03:28:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:14.855773 | orchestrator | 2026-04-07 03:28:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:14.858117 | orchestrator | 2026-04-07 03:28:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:14.858171 | orchestrator | 2026-04-07 03:28:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:17.895966 | orchestrator | 2026-04-07 03:28:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:17.896587 | orchestrator | 2026-04-07 03:28:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:17.896647 | orchestrator | 2026-04-07 03:28:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:20.942603 | orchestrator | 2026-04-07 03:28:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:20.944776 | orchestrator | 2026-04-07 03:28:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:20.944857 | orchestrator | 2026-04-07 03:28:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:23.988016 | orchestrator | 2026-04-07 03:28:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:23.990066 | orchestrator | 2026-04-07 03:28:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:23.990121 | orchestrator | 2026-04-07 03:28:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:27.039745 | orchestrator | 2026-04-07 03:28:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:27.041469 | orchestrator | 2026-04-07 03:28:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:28:27.041568 | orchestrator | 2026-04-07 03:28:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:30.088600 | orchestrator | 2026-04-07 03:28:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:30.091921 | orchestrator | 2026-04-07 03:28:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:30.092001 | orchestrator | 2026-04-07 03:28:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:33.132351 | orchestrator | 2026-04-07 03:28:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:33.132732 | orchestrator | 2026-04-07 03:28:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:33.132833 | orchestrator | 2026-04-07 03:28:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:36.186358 | orchestrator | 2026-04-07 03:28:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:36.188659 | orchestrator | 2026-04-07 03:28:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:36.188707 | orchestrator | 2026-04-07 03:28:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:39.241238 | orchestrator | 2026-04-07 03:28:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:39.243465 | orchestrator | 2026-04-07 03:28:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:39.243725 | orchestrator | 2026-04-07 03:28:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:42.293527 | orchestrator | 2026-04-07 03:28:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:42.296618 | orchestrator | 2026-04-07 03:28:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:42.296687 | orchestrator | 2026-04-07 03:28:42 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:28:45.345644 | orchestrator | 2026-04-07 03:28:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:45.347256 | orchestrator | 2026-04-07 03:28:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:45.347326 | orchestrator | 2026-04-07 03:28:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:48.394339 | orchestrator | 2026-04-07 03:28:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:48.396374 | orchestrator | 2026-04-07 03:28:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:48.396433 | orchestrator | 2026-04-07 03:28:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:51.455439 | orchestrator | 2026-04-07 03:28:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:51.457054 | orchestrator | 2026-04-07 03:28:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:51.457087 | orchestrator | 2026-04-07 03:28:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:54.497082 | orchestrator | 2026-04-07 03:28:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:54.498093 | orchestrator | 2026-04-07 03:28:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:54.498132 | orchestrator | 2026-04-07 03:28:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:28:57.542259 | orchestrator | 2026-04-07 03:28:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:28:57.543227 | orchestrator | 2026-04-07 03:28:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:28:57.543289 | orchestrator | 2026-04-07 03:28:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:00.587129 | orchestrator | 2026-04-07 
03:29:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:00.589100 | orchestrator | 2026-04-07 03:29:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:00.589172 | orchestrator | 2026-04-07 03:29:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:03.636384 | orchestrator | 2026-04-07 03:29:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:03.638142 | orchestrator | 2026-04-07 03:29:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:03.638199 | orchestrator | 2026-04-07 03:29:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:06.683377 | orchestrator | 2026-04-07 03:29:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:06.685526 | orchestrator | 2026-04-07 03:29:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:06.685580 | orchestrator | 2026-04-07 03:29:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:09.730413 | orchestrator | 2026-04-07 03:29:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:09.732170 | orchestrator | 2026-04-07 03:29:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:09.732235 | orchestrator | 2026-04-07 03:29:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:12.782277 | orchestrator | 2026-04-07 03:29:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:12.784417 | orchestrator | 2026-04-07 03:29:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:12.784476 | orchestrator | 2026-04-07 03:29:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:15.828147 | orchestrator | 2026-04-07 03:29:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:29:15.830297 | orchestrator | 2026-04-07 03:29:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:15.830402 | orchestrator | 2026-04-07 03:29:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:18.877163 | orchestrator | 2026-04-07 03:29:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:18.878835 | orchestrator | 2026-04-07 03:29:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:18.878890 | orchestrator | 2026-04-07 03:29:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:21.930552 | orchestrator | 2026-04-07 03:29:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:21.933452 | orchestrator | 2026-04-07 03:29:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:21.933509 | orchestrator | 2026-04-07 03:29:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:24.982524 | orchestrator | 2026-04-07 03:29:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:24.984126 | orchestrator | 2026-04-07 03:29:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:24.984183 | orchestrator | 2026-04-07 03:29:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:28.030594 | orchestrator | 2026-04-07 03:29:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:28.030675 | orchestrator | 2026-04-07 03:29:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:28.030686 | orchestrator | 2026-04-07 03:29:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:31.077424 | orchestrator | 2026-04-07 03:29:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:31.078052 | orchestrator | 2026-04-07 03:29:31 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:31.078067 | orchestrator | 2026-04-07 03:29:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:34.130484 | orchestrator | 2026-04-07 03:29:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:34.131392 | orchestrator | 2026-04-07 03:29:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:34.131445 | orchestrator | 2026-04-07 03:29:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:37.179424 | orchestrator | 2026-04-07 03:29:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:37.181843 | orchestrator | 2026-04-07 03:29:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:37.181914 | orchestrator | 2026-04-07 03:29:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:40.237838 | orchestrator | 2026-04-07 03:29:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:40.240832 | orchestrator | 2026-04-07 03:29:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:40.240880 | orchestrator | 2026-04-07 03:29:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:43.296963 | orchestrator | 2026-04-07 03:29:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:43.299181 | orchestrator | 2026-04-07 03:29:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:43.299274 | orchestrator | 2026-04-07 03:29:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:46.347500 | orchestrator | 2026-04-07 03:29:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:46.348505 | orchestrator | 2026-04-07 03:29:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:29:46.348546 | orchestrator | 2026-04-07 03:29:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:49.399265 | orchestrator | 2026-04-07 03:29:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:49.401236 | orchestrator | 2026-04-07 03:29:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:49.401262 | orchestrator | 2026-04-07 03:29:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:52.447857 | orchestrator | 2026-04-07 03:29:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:52.448797 | orchestrator | 2026-04-07 03:29:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:52.448855 | orchestrator | 2026-04-07 03:29:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:55.493498 | orchestrator | 2026-04-07 03:29:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:55.494597 | orchestrator | 2026-04-07 03:29:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:55.494630 | orchestrator | 2026-04-07 03:29:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:29:58.544827 | orchestrator | 2026-04-07 03:29:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:29:58.546190 | orchestrator | 2026-04-07 03:29:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:29:58.546544 | orchestrator | 2026-04-07 03:29:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:01.591153 | orchestrator | 2026-04-07 03:30:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:01.592202 | orchestrator | 2026-04-07 03:30:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:01.592263 | orchestrator | 2026-04-07 03:30:01 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:30:04.627884 | orchestrator | 2026-04-07 03:30:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:04.630595 | orchestrator | 2026-04-07 03:30:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:04.630781 | orchestrator | 2026-04-07 03:30:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:07.672072 | orchestrator | 2026-04-07 03:30:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:07.673659 | orchestrator | 2026-04-07 03:30:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:07.673717 | orchestrator | 2026-04-07 03:30:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:10.718413 | orchestrator | 2026-04-07 03:30:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:10.720236 | orchestrator | 2026-04-07 03:30:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:10.720263 | orchestrator | 2026-04-07 03:30:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:13.770118 | orchestrator | 2026-04-07 03:30:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:13.771804 | orchestrator | 2026-04-07 03:30:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:13.771840 | orchestrator | 2026-04-07 03:30:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:16.824509 | orchestrator | 2026-04-07 03:30:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:16.825063 | orchestrator | 2026-04-07 03:30:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:16.825094 | orchestrator | 2026-04-07 03:30:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:19.876248 | orchestrator | 2026-04-07 
03:30:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:19.877609 | orchestrator | 2026-04-07 03:30:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:19.877733 | orchestrator | 2026-04-07 03:30:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:22.930291 | orchestrator | 2026-04-07 03:30:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:22.932327 | orchestrator | 2026-04-07 03:30:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:22.932395 | orchestrator | 2026-04-07 03:30:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:25.985753 | orchestrator | 2026-04-07 03:30:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:25.988243 | orchestrator | 2026-04-07 03:30:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:25.988319 | orchestrator | 2026-04-07 03:30:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:29.039863 | orchestrator | 2026-04-07 03:30:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:29.041790 | orchestrator | 2026-04-07 03:30:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:29.041842 | orchestrator | 2026-04-07 03:30:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:32.085455 | orchestrator | 2026-04-07 03:30:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:30:32.087466 | orchestrator | 2026-04-07 03:30:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:32.087520 | orchestrator | 2026-04-07 03:30:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:30:35.136375 | orchestrator | 2026-04-07 03:30:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:30:35.137083 | orchestrator | 2026-04-07 03:30:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:30:35.137124 | orchestrator | 2026-04-07 03:30:35 | INFO  | Wait 1 second(s) until the next check
[... repeated status checks (every ~3 s) elided: tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remained in state STARTED from 03:30:35 through 03:35:52 ...]
2026-04-07 03:35:52.320041 | orchestrator | 2026-04-07 03:35:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state
STARTED 2026-04-07 03:35:52.320564 | orchestrator | 2026-04-07 03:35:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:35:52.321385 | orchestrator | 2026-04-07 03:35:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:35:55.366528 | orchestrator | 2026-04-07 03:35:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:35:55.367507 | orchestrator | 2026-04-07 03:35:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:35:55.367595 | orchestrator | 2026-04-07 03:35:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:35:58.417235 | orchestrator | 2026-04-07 03:35:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:35:58.418962 | orchestrator | 2026-04-07 03:35:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:35:58.419008 | orchestrator | 2026-04-07 03:35:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:01.470286 | orchestrator | 2026-04-07 03:36:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:01.470799 | orchestrator | 2026-04-07 03:36:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:01.470842 | orchestrator | 2026-04-07 03:36:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:04.516800 | orchestrator | 2026-04-07 03:36:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:04.519218 | orchestrator | 2026-04-07 03:36:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:04.519376 | orchestrator | 2026-04-07 03:36:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:07.563657 | orchestrator | 2026-04-07 03:36:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:07.565149 | orchestrator | 2026-04-07 03:36:07 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:07.565338 | orchestrator | 2026-04-07 03:36:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:10.615661 | orchestrator | 2026-04-07 03:36:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:10.617921 | orchestrator | 2026-04-07 03:36:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:10.617983 | orchestrator | 2026-04-07 03:36:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:13.660593 | orchestrator | 2026-04-07 03:36:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:13.660925 | orchestrator | 2026-04-07 03:36:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:13.660953 | orchestrator | 2026-04-07 03:36:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:16.708608 | orchestrator | 2026-04-07 03:36:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:16.710945 | orchestrator | 2026-04-07 03:36:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:16.711046 | orchestrator | 2026-04-07 03:36:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:19.756942 | orchestrator | 2026-04-07 03:36:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:19.758227 | orchestrator | 2026-04-07 03:36:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:19.758317 | orchestrator | 2026-04-07 03:36:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:22.811764 | orchestrator | 2026-04-07 03:36:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:22.812766 | orchestrator | 2026-04-07 03:36:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:36:22.812984 | orchestrator | 2026-04-07 03:36:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:25.861453 | orchestrator | 2026-04-07 03:36:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:25.863198 | orchestrator | 2026-04-07 03:36:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:25.863366 | orchestrator | 2026-04-07 03:36:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:28.911210 | orchestrator | 2026-04-07 03:36:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:28.912247 | orchestrator | 2026-04-07 03:36:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:28.912325 | orchestrator | 2026-04-07 03:36:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:31.966735 | orchestrator | 2026-04-07 03:36:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:31.968592 | orchestrator | 2026-04-07 03:36:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:31.968655 | orchestrator | 2026-04-07 03:36:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:35.014268 | orchestrator | 2026-04-07 03:36:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:35.014672 | orchestrator | 2026-04-07 03:36:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:35.014702 | orchestrator | 2026-04-07 03:36:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:38.062694 | orchestrator | 2026-04-07 03:36:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:38.063920 | orchestrator | 2026-04-07 03:36:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:38.064095 | orchestrator | 2026-04-07 03:36:38 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:36:41.111071 | orchestrator | 2026-04-07 03:36:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:41.113006 | orchestrator | 2026-04-07 03:36:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:41.113075 | orchestrator | 2026-04-07 03:36:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:44.161538 | orchestrator | 2026-04-07 03:36:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:44.163526 | orchestrator | 2026-04-07 03:36:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:44.163601 | orchestrator | 2026-04-07 03:36:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:47.206467 | orchestrator | 2026-04-07 03:36:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:47.208536 | orchestrator | 2026-04-07 03:36:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:47.208622 | orchestrator | 2026-04-07 03:36:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:50.251697 | orchestrator | 2026-04-07 03:36:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:50.254598 | orchestrator | 2026-04-07 03:36:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:50.254656 | orchestrator | 2026-04-07 03:36:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:53.303902 | orchestrator | 2026-04-07 03:36:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:53.306635 | orchestrator | 2026-04-07 03:36:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:53.306705 | orchestrator | 2026-04-07 03:36:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:56.358540 | orchestrator | 2026-04-07 
03:36:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:56.360810 | orchestrator | 2026-04-07 03:36:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:56.360870 | orchestrator | 2026-04-07 03:36:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:36:59.405638 | orchestrator | 2026-04-07 03:36:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:36:59.407067 | orchestrator | 2026-04-07 03:36:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:36:59.407122 | orchestrator | 2026-04-07 03:36:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:02.456634 | orchestrator | 2026-04-07 03:37:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:02.458596 | orchestrator | 2026-04-07 03:37:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:02.458647 | orchestrator | 2026-04-07 03:37:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:05.509040 | orchestrator | 2026-04-07 03:37:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:05.511551 | orchestrator | 2026-04-07 03:37:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:05.511612 | orchestrator | 2026-04-07 03:37:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:08.560085 | orchestrator | 2026-04-07 03:37:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:08.561089 | orchestrator | 2026-04-07 03:37:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:08.561143 | orchestrator | 2026-04-07 03:37:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:11.609920 | orchestrator | 2026-04-07 03:37:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:37:11.612777 | orchestrator | 2026-04-07 03:37:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:11.612844 | orchestrator | 2026-04-07 03:37:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:14.654775 | orchestrator | 2026-04-07 03:37:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:14.657174 | orchestrator | 2026-04-07 03:37:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:14.657218 | orchestrator | 2026-04-07 03:37:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:17.697868 | orchestrator | 2026-04-07 03:37:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:17.700223 | orchestrator | 2026-04-07 03:37:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:17.700341 | orchestrator | 2026-04-07 03:37:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:20.751854 | orchestrator | 2026-04-07 03:37:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:20.753979 | orchestrator | 2026-04-07 03:37:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:20.754126 | orchestrator | 2026-04-07 03:37:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:23.802224 | orchestrator | 2026-04-07 03:37:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:23.803860 | orchestrator | 2026-04-07 03:37:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:23.803940 | orchestrator | 2026-04-07 03:37:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:26.855686 | orchestrator | 2026-04-07 03:37:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:26.856780 | orchestrator | 2026-04-07 03:37:26 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:26.856846 | orchestrator | 2026-04-07 03:37:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:29.907053 | orchestrator | 2026-04-07 03:37:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:29.909654 | orchestrator | 2026-04-07 03:37:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:29.909711 | orchestrator | 2026-04-07 03:37:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:32.958786 | orchestrator | 2026-04-07 03:37:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:32.960210 | orchestrator | 2026-04-07 03:37:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:32.960312 | orchestrator | 2026-04-07 03:37:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:36.010213 | orchestrator | 2026-04-07 03:37:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:36.015972 | orchestrator | 2026-04-07 03:37:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:36.016059 | orchestrator | 2026-04-07 03:37:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:39.060699 | orchestrator | 2026-04-07 03:37:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:39.062373 | orchestrator | 2026-04-07 03:37:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:39.062727 | orchestrator | 2026-04-07 03:37:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:42.105379 | orchestrator | 2026-04-07 03:37:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:42.106580 | orchestrator | 2026-04-07 03:37:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:37:42.106628 | orchestrator | 2026-04-07 03:37:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:45.151650 | orchestrator | 2026-04-07 03:37:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:37:45.152332 | orchestrator | 2026-04-07 03:37:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:37:45.152375 | orchestrator | 2026-04-07 03:37:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:37:48.199453 | orchestrator | 2026-04-07 03:37:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:39:48.291906 | orchestrator | 2026-04-07 03:39:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:39:48.291988 | orchestrator | 2026-04-07 03:39:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:39:51.333356 | orchestrator | 2026-04-07 03:39:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:39:51.334178 | orchestrator | 2026-04-07 03:39:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:39:51.334225 | orchestrator | 2026-04-07 03:39:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:39:54.369927 | orchestrator | 2026-04-07 03:39:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:39:54.373205 | orchestrator | 2026-04-07 03:39:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:39:54.373278 | orchestrator | 2026-04-07 03:39:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:39:57.406987 | orchestrator | 2026-04-07 03:39:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:39:57.409819 | orchestrator | 2026-04-07 03:39:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:39:57.409909 | orchestrator | 2026-04-07 03:39:57 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:40:00.459314 | orchestrator | 2026-04-07 03:40:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:00.461065 | orchestrator | 2026-04-07 03:40:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:00.461179 | orchestrator | 2026-04-07 03:40:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:03.506744 | orchestrator | 2026-04-07 03:40:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:03.508294 | orchestrator | 2026-04-07 03:40:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:03.508334 | orchestrator | 2026-04-07 03:40:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:06.560333 | orchestrator | 2026-04-07 03:40:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:06.562761 | orchestrator | 2026-04-07 03:40:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:06.562821 | orchestrator | 2026-04-07 03:40:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:09.616302 | orchestrator | 2026-04-07 03:40:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:09.619319 | orchestrator | 2026-04-07 03:40:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:09.619358 | orchestrator | 2026-04-07 03:40:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:12.655788 | orchestrator | 2026-04-07 03:40:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:12.657391 | orchestrator | 2026-04-07 03:40:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:12.657469 | orchestrator | 2026-04-07 03:40:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:15.699696 | orchestrator | 2026-04-07 
03:40:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:15.702286 | orchestrator | 2026-04-07 03:40:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:15.702435 | orchestrator | 2026-04-07 03:40:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:18.738772 | orchestrator | 2026-04-07 03:40:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:18.740919 | orchestrator | 2026-04-07 03:40:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:18.740959 | orchestrator | 2026-04-07 03:40:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:21.787317 | orchestrator | 2026-04-07 03:40:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:21.789990 | orchestrator | 2026-04-07 03:40:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:21.790263 | orchestrator | 2026-04-07 03:40:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:24.840215 | orchestrator | 2026-04-07 03:40:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:24.841981 | orchestrator | 2026-04-07 03:40:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:24.842123 | orchestrator | 2026-04-07 03:40:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:27.878788 | orchestrator | 2026-04-07 03:40:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:27.879563 | orchestrator | 2026-04-07 03:40:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:27.879695 | orchestrator | 2026-04-07 03:40:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:30.922861 | orchestrator | 2026-04-07 03:40:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:40:30.924967 | orchestrator | 2026-04-07 03:40:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:30.925015 | orchestrator | 2026-04-07 03:40:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:33.962667 | orchestrator | 2026-04-07 03:40:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:33.964527 | orchestrator | 2026-04-07 03:40:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:33.964554 | orchestrator | 2026-04-07 03:40:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:37.020431 | orchestrator | 2026-04-07 03:40:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:37.023293 | orchestrator | 2026-04-07 03:40:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:37.023390 | orchestrator | 2026-04-07 03:40:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:40.064018 | orchestrator | 2026-04-07 03:40:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:40.066659 | orchestrator | 2026-04-07 03:40:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:40.066713 | orchestrator | 2026-04-07 03:40:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:43.108852 | orchestrator | 2026-04-07 03:40:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:43.109818 | orchestrator | 2026-04-07 03:40:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:43.109874 | orchestrator | 2026-04-07 03:40:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:46.154433 | orchestrator | 2026-04-07 03:40:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:46.155321 | orchestrator | 2026-04-07 03:40:46 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:46.155350 | orchestrator | 2026-04-07 03:40:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:49.198404 | orchestrator | 2026-04-07 03:40:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:49.205160 | orchestrator | 2026-04-07 03:40:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:49.205252 | orchestrator | 2026-04-07 03:40:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:52.243184 | orchestrator | 2026-04-07 03:40:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:52.244545 | orchestrator | 2026-04-07 03:40:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:52.244607 | orchestrator | 2026-04-07 03:40:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:55.284784 | orchestrator | 2026-04-07 03:40:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:55.287578 | orchestrator | 2026-04-07 03:40:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:55.287639 | orchestrator | 2026-04-07 03:40:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:40:58.325957 | orchestrator | 2026-04-07 03:40:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:40:58.326811 | orchestrator | 2026-04-07 03:40:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:40:58.326842 | orchestrator | 2026-04-07 03:40:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:41:01.373612 | orchestrator | 2026-04-07 03:41:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:41:01.375435 | orchestrator | 2026-04-07 03:41:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:41:01.375502 | orchestrator | 2026-04-07 03:41:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:41:04.418858 | orchestrator | 2026-04-07 03:41:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:41:04.422118 | orchestrator | 2026-04-07 03:41:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:41:04.422367 | orchestrator | 2026-04-07 03:41:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:41:07.461849 | orchestrator | 2026-04-07 03:41:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:41:07.465091 | orchestrator | 2026-04-07 03:41:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:41:07.465171 | orchestrator | 2026-04-07 03:41:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:41:10.495792 | orchestrator | 2026-04-07 03:41:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:41:10.496425 | orchestrator | 2026-04-07 03:41:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:41:10.496468 | orchestrator | 2026-04-07 03:41:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:41:13.535640 | orchestrator | 2026-04-07 03:41:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:41:13.536897 | orchestrator | 2026-04-07 03:41:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:41:13.536992 | orchestrator | 2026-04-07 03:41:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:41:16.573945 | orchestrator | 2026-04-07 03:41:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:41:16.576184 | orchestrator | 2026-04-07 03:41:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:41:16.576306 | orchestrator | 2026-04-07 03:41:16 | INFO  | Wait 1 second(s) 
until the next check
2026-04-07 03:41:19.615276 | orchestrator | 2026-04-07 03:41:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:41:19.616256 | orchestrator | 2026-04-07 03:41:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 03:41:19.616297 | orchestrator | 2026-04-07 03:41:19 | INFO  | Wait 1 second(s) until the next check
[... identical status polls repeated roughly every 3 seconds from 03:41:22 through 03:46:30; tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remained in state STARTED throughout ...]
2026-04-07 03:46:33.522474 | orchestrator | 2026-04-07 03:46:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:46:33.524310 | orchestrator | 2026-04-07 03:46:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 03:46:33.524365 | orchestrator | 2026-04-07 03:46:33 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:46:36.574426 | orchestrator | 2026-04-07 03:46:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:46:36.577240 | orchestrator | 2026-04-07 03:46:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:46:36.577358 | orchestrator | 2026-04-07 03:46:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:46:39.622619 | orchestrator | 2026-04-07 03:46:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:46:39.623349 | orchestrator | 2026-04-07 03:46:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:46:39.623398 | orchestrator | 2026-04-07 03:46:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:46:42.673297 | orchestrator | 2026-04-07 03:46:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:46:42.674390 | orchestrator | 2026-04-07 03:46:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:46:42.674484 | orchestrator | 2026-04-07 03:46:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:46:45.719607 | orchestrator | 2026-04-07 03:46:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:46:45.721792 | orchestrator | 2026-04-07 03:46:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:46:45.721864 | orchestrator | 2026-04-07 03:46:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:46:48.772017 | orchestrator | 2026-04-07 03:46:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:46:48.773672 | orchestrator | 2026-04-07 03:46:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:46:48.773818 | orchestrator | 2026-04-07 03:46:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:46:51.819503 | orchestrator | 2026-04-07 
03:46:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:46:51.822495 | orchestrator | 2026-04-07 03:46:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:46:51.822583 | orchestrator | 2026-04-07 03:46:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:46:54.884538 | orchestrator | 2026-04-07 03:46:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:46:54.886338 | orchestrator | 2026-04-07 03:46:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:46:54.886372 | orchestrator | 2026-04-07 03:46:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:46:57.926913 | orchestrator | 2026-04-07 03:46:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:46:57.928501 | orchestrator | 2026-04-07 03:46:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:46:57.928525 | orchestrator | 2026-04-07 03:46:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:00.969133 | orchestrator | 2026-04-07 03:47:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:00.970134 | orchestrator | 2026-04-07 03:47:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:00.970157 | orchestrator | 2026-04-07 03:47:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:04.014915 | orchestrator | 2026-04-07 03:47:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:04.016452 | orchestrator | 2026-04-07 03:47:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:04.016543 | orchestrator | 2026-04-07 03:47:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:07.064540 | orchestrator | 2026-04-07 03:47:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:47:07.066854 | orchestrator | 2026-04-07 03:47:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:07.066987 | orchestrator | 2026-04-07 03:47:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:10.104707 | orchestrator | 2026-04-07 03:47:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:10.107543 | orchestrator | 2026-04-07 03:47:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:10.107676 | orchestrator | 2026-04-07 03:47:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:13.152228 | orchestrator | 2026-04-07 03:47:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:13.153019 | orchestrator | 2026-04-07 03:47:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:13.153057 | orchestrator | 2026-04-07 03:47:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:16.201307 | orchestrator | 2026-04-07 03:47:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:16.203566 | orchestrator | 2026-04-07 03:47:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:16.203779 | orchestrator | 2026-04-07 03:47:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:19.250282 | orchestrator | 2026-04-07 03:47:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:19.251856 | orchestrator | 2026-04-07 03:47:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:19.251914 | orchestrator | 2026-04-07 03:47:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:22.287037 | orchestrator | 2026-04-07 03:47:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:22.288424 | orchestrator | 2026-04-07 03:47:22 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:22.288664 | orchestrator | 2026-04-07 03:47:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:25.337769 | orchestrator | 2026-04-07 03:47:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:25.339512 | orchestrator | 2026-04-07 03:47:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:25.339593 | orchestrator | 2026-04-07 03:47:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:28.391445 | orchestrator | 2026-04-07 03:47:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:28.393160 | orchestrator | 2026-04-07 03:47:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:28.393205 | orchestrator | 2026-04-07 03:47:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:31.438447 | orchestrator | 2026-04-07 03:47:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:31.441595 | orchestrator | 2026-04-07 03:47:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:31.441668 | orchestrator | 2026-04-07 03:47:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:34.480284 | orchestrator | 2026-04-07 03:47:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:34.483208 | orchestrator | 2026-04-07 03:47:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:34.483300 | orchestrator | 2026-04-07 03:47:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:37.518774 | orchestrator | 2026-04-07 03:47:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:37.519584 | orchestrator | 2026-04-07 03:47:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:47:37.519664 | orchestrator | 2026-04-07 03:47:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:40.547444 | orchestrator | 2026-04-07 03:47:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:40.548649 | orchestrator | 2026-04-07 03:47:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:40.548797 | orchestrator | 2026-04-07 03:47:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:43.585140 | orchestrator | 2026-04-07 03:47:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:43.585445 | orchestrator | 2026-04-07 03:47:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:43.585473 | orchestrator | 2026-04-07 03:47:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:46.635252 | orchestrator | 2026-04-07 03:47:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:46.638631 | orchestrator | 2026-04-07 03:47:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:46.638741 | orchestrator | 2026-04-07 03:47:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:49.680179 | orchestrator | 2026-04-07 03:47:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:49.684196 | orchestrator | 2026-04-07 03:47:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:49.684268 | orchestrator | 2026-04-07 03:47:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:52.734992 | orchestrator | 2026-04-07 03:47:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:52.735869 | orchestrator | 2026-04-07 03:47:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:52.735901 | orchestrator | 2026-04-07 03:47:52 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:47:55.783274 | orchestrator | 2026-04-07 03:47:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:55.784910 | orchestrator | 2026-04-07 03:47:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:55.784990 | orchestrator | 2026-04-07 03:47:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:47:58.829252 | orchestrator | 2026-04-07 03:47:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:47:58.831204 | orchestrator | 2026-04-07 03:47:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:47:58.831248 | orchestrator | 2026-04-07 03:47:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:01.879046 | orchestrator | 2026-04-07 03:48:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:01.880287 | orchestrator | 2026-04-07 03:48:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:01.880374 | orchestrator | 2026-04-07 03:48:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:04.926532 | orchestrator | 2026-04-07 03:48:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:04.929245 | orchestrator | 2026-04-07 03:48:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:04.929289 | orchestrator | 2026-04-07 03:48:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:07.975058 | orchestrator | 2026-04-07 03:48:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:07.977568 | orchestrator | 2026-04-07 03:48:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:07.977630 | orchestrator | 2026-04-07 03:48:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:11.019724 | orchestrator | 2026-04-07 
03:48:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:11.021202 | orchestrator | 2026-04-07 03:48:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:11.021260 | orchestrator | 2026-04-07 03:48:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:14.064508 | orchestrator | 2026-04-07 03:48:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:14.067188 | orchestrator | 2026-04-07 03:48:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:14.067247 | orchestrator | 2026-04-07 03:48:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:17.104966 | orchestrator | 2026-04-07 03:48:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:17.105946 | orchestrator | 2026-04-07 03:48:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:17.106059 | orchestrator | 2026-04-07 03:48:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:20.140001 | orchestrator | 2026-04-07 03:48:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:20.141417 | orchestrator | 2026-04-07 03:48:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:20.141454 | orchestrator | 2026-04-07 03:48:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:23.187781 | orchestrator | 2026-04-07 03:48:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:23.189012 | orchestrator | 2026-04-07 03:48:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:23.189121 | orchestrator | 2026-04-07 03:48:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:26.232511 | orchestrator | 2026-04-07 03:48:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:48:26.234558 | orchestrator | 2026-04-07 03:48:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:26.234686 | orchestrator | 2026-04-07 03:48:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:29.275859 | orchestrator | 2026-04-07 03:48:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:29.276906 | orchestrator | 2026-04-07 03:48:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:29.277035 | orchestrator | 2026-04-07 03:48:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:32.323324 | orchestrator | 2026-04-07 03:48:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:32.324933 | orchestrator | 2026-04-07 03:48:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:32.324998 | orchestrator | 2026-04-07 03:48:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:35.368875 | orchestrator | 2026-04-07 03:48:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:35.370319 | orchestrator | 2026-04-07 03:48:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:35.370365 | orchestrator | 2026-04-07 03:48:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:38.421578 | orchestrator | 2026-04-07 03:48:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:38.422894 | orchestrator | 2026-04-07 03:48:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:38.423273 | orchestrator | 2026-04-07 03:48:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:41.468218 | orchestrator | 2026-04-07 03:48:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:41.469954 | orchestrator | 2026-04-07 03:48:41 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:41.470003 | orchestrator | 2026-04-07 03:48:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:44.520284 | orchestrator | 2026-04-07 03:48:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:44.522578 | orchestrator | 2026-04-07 03:48:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:44.522700 | orchestrator | 2026-04-07 03:48:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:47.576602 | orchestrator | 2026-04-07 03:48:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:47.577734 | orchestrator | 2026-04-07 03:48:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:47.577793 | orchestrator | 2026-04-07 03:48:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:50.616388 | orchestrator | 2026-04-07 03:48:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:50.617533 | orchestrator | 2026-04-07 03:48:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:50.617578 | orchestrator | 2026-04-07 03:48:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:53.662838 | orchestrator | 2026-04-07 03:48:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:53.663330 | orchestrator | 2026-04-07 03:48:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:53.663364 | orchestrator | 2026-04-07 03:48:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:56.713963 | orchestrator | 2026-04-07 03:48:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:56.718141 | orchestrator | 2026-04-07 03:48:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:48:56.718256 | orchestrator | 2026-04-07 03:48:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:48:59.772066 | orchestrator | 2026-04-07 03:48:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:48:59.773825 | orchestrator | 2026-04-07 03:48:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:48:59.773868 | orchestrator | 2026-04-07 03:48:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:02.824991 | orchestrator | 2026-04-07 03:49:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:02.826732 | orchestrator | 2026-04-07 03:49:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:02.826775 | orchestrator | 2026-04-07 03:49:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:05.878789 | orchestrator | 2026-04-07 03:49:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:05.881048 | orchestrator | 2026-04-07 03:49:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:05.881135 | orchestrator | 2026-04-07 03:49:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:08.934246 | orchestrator | 2026-04-07 03:49:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:08.935671 | orchestrator | 2026-04-07 03:49:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:08.935712 | orchestrator | 2026-04-07 03:49:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:11.981806 | orchestrator | 2026-04-07 03:49:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:11.983014 | orchestrator | 2026-04-07 03:49:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:11.983076 | orchestrator | 2026-04-07 03:49:11 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:49:15.034383 | orchestrator | 2026-04-07 03:49:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:15.036690 | orchestrator | 2026-04-07 03:49:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:15.036764 | orchestrator | 2026-04-07 03:49:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:18.082425 | orchestrator | 2026-04-07 03:49:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:18.083206 | orchestrator | 2026-04-07 03:49:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:18.083250 | orchestrator | 2026-04-07 03:49:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:21.136448 | orchestrator | 2026-04-07 03:49:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:21.138304 | orchestrator | 2026-04-07 03:49:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:21.138379 | orchestrator | 2026-04-07 03:49:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:24.176822 | orchestrator | 2026-04-07 03:49:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:24.177878 | orchestrator | 2026-04-07 03:49:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:24.177927 | orchestrator | 2026-04-07 03:49:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:27.230188 | orchestrator | 2026-04-07 03:49:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:27.231008 | orchestrator | 2026-04-07 03:49:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:27.231072 | orchestrator | 2026-04-07 03:49:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:30.273300 | orchestrator | 2026-04-07 
03:49:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:30.274048 | orchestrator | 2026-04-07 03:49:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:30.274155 | orchestrator | 2026-04-07 03:49:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:33.318876 | orchestrator | 2026-04-07 03:49:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:33.322316 | orchestrator | 2026-04-07 03:49:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:33.322482 | orchestrator | 2026-04-07 03:49:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:36.370001 | orchestrator | 2026-04-07 03:49:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:36.372995 | orchestrator | 2026-04-07 03:49:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:36.373057 | orchestrator | 2026-04-07 03:49:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:39.421188 | orchestrator | 2026-04-07 03:49:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:39.424039 | orchestrator | 2026-04-07 03:49:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:39.424084 | orchestrator | 2026-04-07 03:49:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:42.468469 | orchestrator | 2026-04-07 03:49:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:42.469209 | orchestrator | 2026-04-07 03:49:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:42.469318 | orchestrator | 2026-04-07 03:49:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:45.522383 | orchestrator | 2026-04-07 03:49:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:49:45.524891 | orchestrator | 2026-04-07 03:49:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:45.524969 | orchestrator | 2026-04-07 03:49:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:48.571361 | orchestrator | 2026-04-07 03:49:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:48.575360 | orchestrator | 2026-04-07 03:49:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:48.575433 | orchestrator | 2026-04-07 03:49:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:51.616000 | orchestrator | 2026-04-07 03:49:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:51.617915 | orchestrator | 2026-04-07 03:49:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:51.617973 | orchestrator | 2026-04-07 03:49:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:54.674997 | orchestrator | 2026-04-07 03:49:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:54.677455 | orchestrator | 2026-04-07 03:49:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:54.677532 | orchestrator | 2026-04-07 03:49:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:49:57.720682 | orchestrator | 2026-04-07 03:49:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:49:57.721783 | orchestrator | 2026-04-07 03:49:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:49:57.721862 | orchestrator | 2026-04-07 03:49:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:50:00.771316 | orchestrator | 2026-04-07 03:50:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:50:00.773300 | orchestrator | 2026-04-07 03:50:00 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:50:00.773409 | orchestrator | 2026-04-07 03:50:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:50:03.819965 | orchestrator | 2026-04-07 03:50:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:50:03.821302 | orchestrator | 2026-04-07 03:50:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:50:03.821360 | orchestrator | 2026-04-07 03:50:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:50:06.865812 | orchestrator | 2026-04-07 03:50:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:50:06.866546 | orchestrator | 2026-04-07 03:50:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:50:06.866601 | orchestrator | 2026-04-07 03:50:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:50:09.916020 | orchestrator | 2026-04-07 03:50:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:50:09.917543 | orchestrator | 2026-04-07 03:50:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:50:09.917654 | orchestrator | 2026-04-07 03:50:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:50:12.967856 | orchestrator | 2026-04-07 03:50:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:50:12.969414 | orchestrator | 2026-04-07 03:50:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:50:12.969467 | orchestrator | 2026-04-07 03:50:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:50:16.019984 | orchestrator | 2026-04-07 03:50:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:50:16.021419 | orchestrator | 2026-04-07 03:50:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:50:16.021469 | orchestrator | 2026-04-07 03:50:16 | INFO  | Wait 1 second(s) until the next check
2026-04-07 03:50:19.060307 | orchestrator | 2026-04-07 03:50:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:50:19.061276 | orchestrator | 2026-04-07 03:50:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 03:50:19.061340 | orchestrator | 2026-04-07 03:50:19 | INFO  | Wait 1 second(s) until the next check
2026-04-07 03:55:48.260193 | orchestrator | 2026-04-07 03:55:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:55:48.264196 | orchestrator | 2026-04-07 03:55:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 03:55:48.264278 | orchestrator | 2026-04-07 03:55:48 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:55:51.311415 | orchestrator | 2026-04-07 03:55:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:55:51.312666 | orchestrator | 2026-04-07 03:55:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:55:51.312737 | orchestrator | 2026-04-07 03:55:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:55:54.367156 | orchestrator | 2026-04-07 03:55:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:55:54.368271 | orchestrator | 2026-04-07 03:55:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:55:54.368383 | orchestrator | 2026-04-07 03:55:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:55:57.421410 | orchestrator | 2026-04-07 03:55:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:55:57.421692 | orchestrator | 2026-04-07 03:55:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:55:57.422195 | orchestrator | 2026-04-07 03:55:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:00.459895 | orchestrator | 2026-04-07 03:56:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:00.460396 | orchestrator | 2026-04-07 03:56:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:00.460417 | orchestrator | 2026-04-07 03:56:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:03.514583 | orchestrator | 2026-04-07 03:56:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:03.516878 | orchestrator | 2026-04-07 03:56:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:03.516932 | orchestrator | 2026-04-07 03:56:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:06.563174 | orchestrator | 2026-04-07 
03:56:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:06.565472 | orchestrator | 2026-04-07 03:56:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:06.565528 | orchestrator | 2026-04-07 03:56:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:09.611603 | orchestrator | 2026-04-07 03:56:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:09.612559 | orchestrator | 2026-04-07 03:56:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:09.612661 | orchestrator | 2026-04-07 03:56:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:12.658798 | orchestrator | 2026-04-07 03:56:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:12.661923 | orchestrator | 2026-04-07 03:56:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:12.662059 | orchestrator | 2026-04-07 03:56:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:15.714478 | orchestrator | 2026-04-07 03:56:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:15.717773 | orchestrator | 2026-04-07 03:56:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:15.717836 | orchestrator | 2026-04-07 03:56:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:18.767020 | orchestrator | 2026-04-07 03:56:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:18.769445 | orchestrator | 2026-04-07 03:56:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:18.769496 | orchestrator | 2026-04-07 03:56:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:21.820053 | orchestrator | 2026-04-07 03:56:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:56:21.821477 | orchestrator | 2026-04-07 03:56:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:21.821529 | orchestrator | 2026-04-07 03:56:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:24.858329 | orchestrator | 2026-04-07 03:56:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:24.860447 | orchestrator | 2026-04-07 03:56:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:24.860480 | orchestrator | 2026-04-07 03:56:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:27.900854 | orchestrator | 2026-04-07 03:56:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:27.903520 | orchestrator | 2026-04-07 03:56:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:27.903593 | orchestrator | 2026-04-07 03:56:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:30.951215 | orchestrator | 2026-04-07 03:56:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:30.952587 | orchestrator | 2026-04-07 03:56:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:30.952784 | orchestrator | 2026-04-07 03:56:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:33.994301 | orchestrator | 2026-04-07 03:56:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:33.995672 | orchestrator | 2026-04-07 03:56:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:33.995757 | orchestrator | 2026-04-07 03:56:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:37.038411 | orchestrator | 2026-04-07 03:56:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:37.040546 | orchestrator | 2026-04-07 03:56:37 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:37.040626 | orchestrator | 2026-04-07 03:56:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:40.078479 | orchestrator | 2026-04-07 03:56:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:40.080682 | orchestrator | 2026-04-07 03:56:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:40.080751 | orchestrator | 2026-04-07 03:56:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:43.126092 | orchestrator | 2026-04-07 03:56:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:43.128035 | orchestrator | 2026-04-07 03:56:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:43.128093 | orchestrator | 2026-04-07 03:56:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:46.173205 | orchestrator | 2026-04-07 03:56:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:46.174191 | orchestrator | 2026-04-07 03:56:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:46.174227 | orchestrator | 2026-04-07 03:56:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:49.223230 | orchestrator | 2026-04-07 03:56:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:49.224412 | orchestrator | 2026-04-07 03:56:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:49.224866 | orchestrator | 2026-04-07 03:56:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:52.266194 | orchestrator | 2026-04-07 03:56:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:52.267879 | orchestrator | 2026-04-07 03:56:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:56:52.267977 | orchestrator | 2026-04-07 03:56:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:55.302790 | orchestrator | 2026-04-07 03:56:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:55.303425 | orchestrator | 2026-04-07 03:56:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:55.303474 | orchestrator | 2026-04-07 03:56:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:56:58.346873 | orchestrator | 2026-04-07 03:56:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:56:58.348324 | orchestrator | 2026-04-07 03:56:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:56:58.348367 | orchestrator | 2026-04-07 03:56:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:01.387152 | orchestrator | 2026-04-07 03:57:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:01.387405 | orchestrator | 2026-04-07 03:57:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:01.387424 | orchestrator | 2026-04-07 03:57:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:04.428035 | orchestrator | 2026-04-07 03:57:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:04.429660 | orchestrator | 2026-04-07 03:57:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:04.429792 | orchestrator | 2026-04-07 03:57:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:07.485715 | orchestrator | 2026-04-07 03:57:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:07.489252 | orchestrator | 2026-04-07 03:57:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:07.489940 | orchestrator | 2026-04-07 03:57:07 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:57:10.545421 | orchestrator | 2026-04-07 03:57:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:10.546143 | orchestrator | 2026-04-07 03:57:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:10.546172 | orchestrator | 2026-04-07 03:57:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:13.590403 | orchestrator | 2026-04-07 03:57:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:13.594305 | orchestrator | 2026-04-07 03:57:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:13.594427 | orchestrator | 2026-04-07 03:57:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:16.644969 | orchestrator | 2026-04-07 03:57:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:16.646658 | orchestrator | 2026-04-07 03:57:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:16.646696 | orchestrator | 2026-04-07 03:57:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:19.696889 | orchestrator | 2026-04-07 03:57:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:19.698627 | orchestrator | 2026-04-07 03:57:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:19.698686 | orchestrator | 2026-04-07 03:57:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:22.739283 | orchestrator | 2026-04-07 03:57:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:22.741148 | orchestrator | 2026-04-07 03:57:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:22.741229 | orchestrator | 2026-04-07 03:57:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:25.780008 | orchestrator | 2026-04-07 
03:57:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:25.783601 | orchestrator | 2026-04-07 03:57:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:25.784348 | orchestrator | 2026-04-07 03:57:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:28.830088 | orchestrator | 2026-04-07 03:57:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:28.831985 | orchestrator | 2026-04-07 03:57:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:28.832086 | orchestrator | 2026-04-07 03:57:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:31.879524 | orchestrator | 2026-04-07 03:57:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:31.880662 | orchestrator | 2026-04-07 03:57:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:31.880702 | orchestrator | 2026-04-07 03:57:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:34.931378 | orchestrator | 2026-04-07 03:57:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:34.933070 | orchestrator | 2026-04-07 03:57:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:34.933245 | orchestrator | 2026-04-07 03:57:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:37.983525 | orchestrator | 2026-04-07 03:57:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:37.988563 | orchestrator | 2026-04-07 03:57:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:37.988662 | orchestrator | 2026-04-07 03:57:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:41.038410 | orchestrator | 2026-04-07 03:57:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:57:41.039297 | orchestrator | 2026-04-07 03:57:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:41.039347 | orchestrator | 2026-04-07 03:57:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:44.080535 | orchestrator | 2026-04-07 03:57:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:44.082218 | orchestrator | 2026-04-07 03:57:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:44.082262 | orchestrator | 2026-04-07 03:57:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:47.128507 | orchestrator | 2026-04-07 03:57:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:47.133251 | orchestrator | 2026-04-07 03:57:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:47.133345 | orchestrator | 2026-04-07 03:57:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:50.176554 | orchestrator | 2026-04-07 03:57:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:50.177176 | orchestrator | 2026-04-07 03:57:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:50.177273 | orchestrator | 2026-04-07 03:57:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:53.228621 | orchestrator | 2026-04-07 03:57:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:53.232095 | orchestrator | 2026-04-07 03:57:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:53.232168 | orchestrator | 2026-04-07 03:57:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:56.267790 | orchestrator | 2026-04-07 03:57:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:56.269127 | orchestrator | 2026-04-07 03:57:56 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:56.269149 | orchestrator | 2026-04-07 03:57:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:57:59.305439 | orchestrator | 2026-04-07 03:57:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:57:59.306649 | orchestrator | 2026-04-07 03:57:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:57:59.306697 | orchestrator | 2026-04-07 03:57:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:02.357361 | orchestrator | 2026-04-07 03:58:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:02.359345 | orchestrator | 2026-04-07 03:58:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:02.359398 | orchestrator | 2026-04-07 03:58:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:05.408670 | orchestrator | 2026-04-07 03:58:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:05.410632 | orchestrator | 2026-04-07 03:58:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:05.410733 | orchestrator | 2026-04-07 03:58:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:08.463330 | orchestrator | 2026-04-07 03:58:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:08.464868 | orchestrator | 2026-04-07 03:58:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:08.464921 | orchestrator | 2026-04-07 03:58:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:11.505457 | orchestrator | 2026-04-07 03:58:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:11.506798 | orchestrator | 2026-04-07 03:58:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:58:11.506859 | orchestrator | 2026-04-07 03:58:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:14.548733 | orchestrator | 2026-04-07 03:58:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:14.552150 | orchestrator | 2026-04-07 03:58:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:14.552287 | orchestrator | 2026-04-07 03:58:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:17.598973 | orchestrator | 2026-04-07 03:58:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:17.600692 | orchestrator | 2026-04-07 03:58:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:17.600738 | orchestrator | 2026-04-07 03:58:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:20.644580 | orchestrator | 2026-04-07 03:58:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:20.646388 | orchestrator | 2026-04-07 03:58:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:20.646444 | orchestrator | 2026-04-07 03:58:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:23.695217 | orchestrator | 2026-04-07 03:58:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:23.696729 | orchestrator | 2026-04-07 03:58:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:23.696768 | orchestrator | 2026-04-07 03:58:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:26.744947 | orchestrator | 2026-04-07 03:58:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:26.747141 | orchestrator | 2026-04-07 03:58:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:26.747256 | orchestrator | 2026-04-07 03:58:26 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 03:58:29.790857 | orchestrator | 2026-04-07 03:58:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:29.792525 | orchestrator | 2026-04-07 03:58:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:29.792577 | orchestrator | 2026-04-07 03:58:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:32.835719 | orchestrator | 2026-04-07 03:58:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:32.837544 | orchestrator | 2026-04-07 03:58:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:32.837599 | orchestrator | 2026-04-07 03:58:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:35.884084 | orchestrator | 2026-04-07 03:58:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:35.885208 | orchestrator | 2026-04-07 03:58:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:35.885257 | orchestrator | 2026-04-07 03:58:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:38.927780 | orchestrator | 2026-04-07 03:58:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:38.928593 | orchestrator | 2026-04-07 03:58:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:38.928617 | orchestrator | 2026-04-07 03:58:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:41.979550 | orchestrator | 2026-04-07 03:58:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:41.981523 | orchestrator | 2026-04-07 03:58:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:41.981623 | orchestrator | 2026-04-07 03:58:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:45.032037 | orchestrator | 2026-04-07 
03:58:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:45.033371 | orchestrator | 2026-04-07 03:58:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:45.033459 | orchestrator | 2026-04-07 03:58:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:48.079006 | orchestrator | 2026-04-07 03:58:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:48.079783 | orchestrator | 2026-04-07 03:58:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:48.079809 | orchestrator | 2026-04-07 03:58:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:51.126553 | orchestrator | 2026-04-07 03:58:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:51.128892 | orchestrator | 2026-04-07 03:58:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:51.128980 | orchestrator | 2026-04-07 03:58:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:54.166942 | orchestrator | 2026-04-07 03:58:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:54.168858 | orchestrator | 2026-04-07 03:58:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:54.168922 | orchestrator | 2026-04-07 03:58:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:58:57.214559 | orchestrator | 2026-04-07 03:58:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:58:57.214969 | orchestrator | 2026-04-07 03:58:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:58:57.215004 | orchestrator | 2026-04-07 03:58:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:00.261981 | orchestrator | 2026-04-07 03:59:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 03:59:00.262975 | orchestrator | 2026-04-07 03:59:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:00.263058 | orchestrator | 2026-04-07 03:59:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:03.313625 | orchestrator | 2026-04-07 03:59:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:03.314230 | orchestrator | 2026-04-07 03:59:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:03.314254 | orchestrator | 2026-04-07 03:59:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:06.354719 | orchestrator | 2026-04-07 03:59:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:06.357627 | orchestrator | 2026-04-07 03:59:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:06.357731 | orchestrator | 2026-04-07 03:59:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:09.407313 | orchestrator | 2026-04-07 03:59:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:09.410347 | orchestrator | 2026-04-07 03:59:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:09.410415 | orchestrator | 2026-04-07 03:59:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:12.454985 | orchestrator | 2026-04-07 03:59:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:12.457091 | orchestrator | 2026-04-07 03:59:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:12.457189 | orchestrator | 2026-04-07 03:59:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:15.497848 | orchestrator | 2026-04-07 03:59:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:15.499171 | orchestrator | 2026-04-07 03:59:15 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:15.499204 | orchestrator | 2026-04-07 03:59:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:18.540691 | orchestrator | 2026-04-07 03:59:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:18.542746 | orchestrator | 2026-04-07 03:59:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:18.542793 | orchestrator | 2026-04-07 03:59:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:21.587609 | orchestrator | 2026-04-07 03:59:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:21.588229 | orchestrator | 2026-04-07 03:59:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:21.588279 | orchestrator | 2026-04-07 03:59:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:24.635666 | orchestrator | 2026-04-07 03:59:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:24.638281 | orchestrator | 2026-04-07 03:59:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:24.638374 | orchestrator | 2026-04-07 03:59:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:27.681202 | orchestrator | 2026-04-07 03:59:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:27.683025 | orchestrator | 2026-04-07 03:59:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 03:59:27.683204 | orchestrator | 2026-04-07 03:59:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 03:59:30.724846 | orchestrator | 2026-04-07 03:59:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 03:59:30.726641 | orchestrator | 2026-04-07 03:59:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
03:59:30.726686 | orchestrator | 2026-04-07 03:59:30 | INFO  | Wait 1 second(s) until the next check
2026-04-07 03:59:33.772534 | orchestrator | 2026-04-07 03:59:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 03:59:33.775376 | orchestrator | 2026-04-07 03:59:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 03:59:33.775463 | orchestrator | 2026-04-07 03:59:33 | INFO  | Wait 1 second(s) until the next check
[identical status checks repeated every ~3 seconds from 03:59:36 through 04:04:29; tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remained in state STARTED throughout]
2026-04-07 04:04:32.520382 | orchestrator | 2026-04-07 04:04:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:32.521260 | orchestrator | 2026-04-07 04:04:32 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:32.521293 | orchestrator | 2026-04-07 04:04:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:35.572772 | orchestrator | 2026-04-07 04:04:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:35.573793 | orchestrator | 2026-04-07 04:04:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:35.574007 | orchestrator | 2026-04-07 04:04:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:38.621514 | orchestrator | 2026-04-07 04:04:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:38.623373 | orchestrator | 2026-04-07 04:04:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:38.623499 | orchestrator | 2026-04-07 04:04:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:41.663702 | orchestrator | 2026-04-07 04:04:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:41.664950 | orchestrator | 2026-04-07 04:04:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:41.664991 | orchestrator | 2026-04-07 04:04:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:44.705951 | orchestrator | 2026-04-07 04:04:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:44.707130 | orchestrator | 2026-04-07 04:04:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:44.707178 | orchestrator | 2026-04-07 04:04:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:47.753993 | orchestrator | 2026-04-07 04:04:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:47.755458 | orchestrator | 2026-04-07 04:04:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:04:47.755520 | orchestrator | 2026-04-07 04:04:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:50.798337 | orchestrator | 2026-04-07 04:04:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:50.800016 | orchestrator | 2026-04-07 04:04:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:50.800088 | orchestrator | 2026-04-07 04:04:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:53.854995 | orchestrator | 2026-04-07 04:04:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:53.856781 | orchestrator | 2026-04-07 04:04:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:53.856879 | orchestrator | 2026-04-07 04:04:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:56.900698 | orchestrator | 2026-04-07 04:04:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:56.902464 | orchestrator | 2026-04-07 04:04:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:56.902542 | orchestrator | 2026-04-07 04:04:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:04:59.952141 | orchestrator | 2026-04-07 04:04:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:04:59.955141 | orchestrator | 2026-04-07 04:04:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:04:59.955211 | orchestrator | 2026-04-07 04:04:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:03.000250 | orchestrator | 2026-04-07 04:05:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:03.003635 | orchestrator | 2026-04-07 04:05:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:03.003732 | orchestrator | 2026-04-07 04:05:03 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 04:05:06.052836 | orchestrator | 2026-04-07 04:05:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:06.054431 | orchestrator | 2026-04-07 04:05:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:06.054491 | orchestrator | 2026-04-07 04:05:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:09.097860 | orchestrator | 2026-04-07 04:05:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:09.100272 | orchestrator | 2026-04-07 04:05:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:09.100320 | orchestrator | 2026-04-07 04:05:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:12.141780 | orchestrator | 2026-04-07 04:05:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:12.144263 | orchestrator | 2026-04-07 04:05:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:12.144329 | orchestrator | 2026-04-07 04:05:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:15.191314 | orchestrator | 2026-04-07 04:05:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:15.193190 | orchestrator | 2026-04-07 04:05:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:15.193252 | orchestrator | 2026-04-07 04:05:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:18.231547 | orchestrator | 2026-04-07 04:05:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:18.232578 | orchestrator | 2026-04-07 04:05:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:18.232619 | orchestrator | 2026-04-07 04:05:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:21.275563 | orchestrator | 2026-04-07 
04:05:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:21.278549 | orchestrator | 2026-04-07 04:05:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:21.278610 | orchestrator | 2026-04-07 04:05:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:24.327958 | orchestrator | 2026-04-07 04:05:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:24.330148 | orchestrator | 2026-04-07 04:05:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:24.330206 | orchestrator | 2026-04-07 04:05:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:27.378168 | orchestrator | 2026-04-07 04:05:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:27.380002 | orchestrator | 2026-04-07 04:05:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:27.380135 | orchestrator | 2026-04-07 04:05:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:30.423468 | orchestrator | 2026-04-07 04:05:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:30.424429 | orchestrator | 2026-04-07 04:05:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:30.424482 | orchestrator | 2026-04-07 04:05:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:33.476365 | orchestrator | 2026-04-07 04:05:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:33.477748 | orchestrator | 2026-04-07 04:05:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:33.477815 | orchestrator | 2026-04-07 04:05:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:36.518300 | orchestrator | 2026-04-07 04:05:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 04:05:36.521027 | orchestrator | 2026-04-07 04:05:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:36.521085 | orchestrator | 2026-04-07 04:05:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:39.571995 | orchestrator | 2026-04-07 04:05:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:39.574289 | orchestrator | 2026-04-07 04:05:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:39.574371 | orchestrator | 2026-04-07 04:05:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:42.617926 | orchestrator | 2026-04-07 04:05:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:42.621642 | orchestrator | 2026-04-07 04:05:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:42.621753 | orchestrator | 2026-04-07 04:05:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:45.679375 | orchestrator | 2026-04-07 04:05:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:45.681555 | orchestrator | 2026-04-07 04:05:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:45.681601 | orchestrator | 2026-04-07 04:05:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:48.737218 | orchestrator | 2026-04-07 04:05:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:48.739279 | orchestrator | 2026-04-07 04:05:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:48.739321 | orchestrator | 2026-04-07 04:05:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:51.782148 | orchestrator | 2026-04-07 04:05:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:51.784056 | orchestrator | 2026-04-07 04:05:51 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:51.784118 | orchestrator | 2026-04-07 04:05:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:54.823318 | orchestrator | 2026-04-07 04:05:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:54.823911 | orchestrator | 2026-04-07 04:05:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:54.824163 | orchestrator | 2026-04-07 04:05:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:05:57.865132 | orchestrator | 2026-04-07 04:05:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:05:57.865562 | orchestrator | 2026-04-07 04:05:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:05:57.865629 | orchestrator | 2026-04-07 04:05:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:00.916636 | orchestrator | 2026-04-07 04:06:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:00.917680 | orchestrator | 2026-04-07 04:06:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:00.917704 | orchestrator | 2026-04-07 04:06:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:03.980157 | orchestrator | 2026-04-07 04:06:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:03.980282 | orchestrator | 2026-04-07 04:06:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:03.980294 | orchestrator | 2026-04-07 04:06:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:07.040053 | orchestrator | 2026-04-07 04:06:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:07.041888 | orchestrator | 2026-04-07 04:06:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:06:07.041955 | orchestrator | 2026-04-07 04:06:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:10.088212 | orchestrator | 2026-04-07 04:06:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:10.090073 | orchestrator | 2026-04-07 04:06:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:10.090154 | orchestrator | 2026-04-07 04:06:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:13.136740 | orchestrator | 2026-04-07 04:06:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:13.137589 | orchestrator | 2026-04-07 04:06:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:13.137619 | orchestrator | 2026-04-07 04:06:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:16.194603 | orchestrator | 2026-04-07 04:06:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:16.196841 | orchestrator | 2026-04-07 04:06:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:16.196894 | orchestrator | 2026-04-07 04:06:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:19.246208 | orchestrator | 2026-04-07 04:06:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:19.247228 | orchestrator | 2026-04-07 04:06:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:19.247282 | orchestrator | 2026-04-07 04:06:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:22.293622 | orchestrator | 2026-04-07 04:06:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:22.296211 | orchestrator | 2026-04-07 04:06:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:22.296468 | orchestrator | 2026-04-07 04:06:22 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 04:06:25.344160 | orchestrator | 2026-04-07 04:06:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:25.345934 | orchestrator | 2026-04-07 04:06:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:25.345971 | orchestrator | 2026-04-07 04:06:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:28.390334 | orchestrator | 2026-04-07 04:06:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:28.392650 | orchestrator | 2026-04-07 04:06:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:28.392697 | orchestrator | 2026-04-07 04:06:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:31.439463 | orchestrator | 2026-04-07 04:06:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:31.440272 | orchestrator | 2026-04-07 04:06:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:31.440347 | orchestrator | 2026-04-07 04:06:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:34.484501 | orchestrator | 2026-04-07 04:06:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:34.486395 | orchestrator | 2026-04-07 04:06:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:34.486599 | orchestrator | 2026-04-07 04:06:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:37.531225 | orchestrator | 2026-04-07 04:06:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:37.533111 | orchestrator | 2026-04-07 04:06:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:37.533167 | orchestrator | 2026-04-07 04:06:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:40.580204 | orchestrator | 2026-04-07 
04:06:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:40.582130 | orchestrator | 2026-04-07 04:06:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:40.582185 | orchestrator | 2026-04-07 04:06:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:43.628085 | orchestrator | 2026-04-07 04:06:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:43.630166 | orchestrator | 2026-04-07 04:06:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:43.630249 | orchestrator | 2026-04-07 04:06:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:46.674450 | orchestrator | 2026-04-07 04:06:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:46.676687 | orchestrator | 2026-04-07 04:06:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:46.676733 | orchestrator | 2026-04-07 04:06:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:49.721503 | orchestrator | 2026-04-07 04:06:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:49.722654 | orchestrator | 2026-04-07 04:06:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:49.722721 | orchestrator | 2026-04-07 04:06:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:52.765032 | orchestrator | 2026-04-07 04:06:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:52.766610 | orchestrator | 2026-04-07 04:06:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:52.766647 | orchestrator | 2026-04-07 04:06:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:55.815365 | orchestrator | 2026-04-07 04:06:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 04:06:55.815960 | orchestrator | 2026-04-07 04:06:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:55.816164 | orchestrator | 2026-04-07 04:06:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:06:58.865428 | orchestrator | 2026-04-07 04:06:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:06:58.866434 | orchestrator | 2026-04-07 04:06:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:06:58.866495 | orchestrator | 2026-04-07 04:06:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:01.917355 | orchestrator | 2026-04-07 04:07:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:01.918698 | orchestrator | 2026-04-07 04:07:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:01.918782 | orchestrator | 2026-04-07 04:07:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:04.965402 | orchestrator | 2026-04-07 04:07:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:04.967206 | orchestrator | 2026-04-07 04:07:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:04.967275 | orchestrator | 2026-04-07 04:07:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:08.008027 | orchestrator | 2026-04-07 04:07:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:08.010650 | orchestrator | 2026-04-07 04:07:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:08.010739 | orchestrator | 2026-04-07 04:07:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:11.047250 | orchestrator | 2026-04-07 04:07:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:11.048494 | orchestrator | 2026-04-07 04:07:11 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:11.048547 | orchestrator | 2026-04-07 04:07:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:14.087320 | orchestrator | 2026-04-07 04:07:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:14.088436 | orchestrator | 2026-04-07 04:07:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:14.088487 | orchestrator | 2026-04-07 04:07:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:17.132411 | orchestrator | 2026-04-07 04:07:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:17.134344 | orchestrator | 2026-04-07 04:07:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:17.134469 | orchestrator | 2026-04-07 04:07:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:20.181371 | orchestrator | 2026-04-07 04:07:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:20.183062 | orchestrator | 2026-04-07 04:07:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:20.183153 | orchestrator | 2026-04-07 04:07:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:23.231309 | orchestrator | 2026-04-07 04:07:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:23.234437 | orchestrator | 2026-04-07 04:07:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:23.234523 | orchestrator | 2026-04-07 04:07:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:26.268934 | orchestrator | 2026-04-07 04:07:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:26.269710 | orchestrator | 2026-04-07 04:07:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:07:26.269807 | orchestrator | 2026-04-07 04:07:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:29.315305 | orchestrator | 2026-04-07 04:07:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:29.318634 | orchestrator | 2026-04-07 04:07:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:29.318809 | orchestrator | 2026-04-07 04:07:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:32.366185 | orchestrator | 2026-04-07 04:07:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:32.367947 | orchestrator | 2026-04-07 04:07:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:32.367997 | orchestrator | 2026-04-07 04:07:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:35.418355 | orchestrator | 2026-04-07 04:07:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:35.418982 | orchestrator | 2026-04-07 04:07:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:35.419033 | orchestrator | 2026-04-07 04:07:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:38.461589 | orchestrator | 2026-04-07 04:07:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:38.463836 | orchestrator | 2026-04-07 04:07:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:38.463896 | orchestrator | 2026-04-07 04:07:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:41.510136 | orchestrator | 2026-04-07 04:07:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:41.510343 | orchestrator | 2026-04-07 04:07:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:41.510366 | orchestrator | 2026-04-07 04:07:41 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 04:07:44.545910 | orchestrator | 2026-04-07 04:07:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:44.547584 | orchestrator | 2026-04-07 04:07:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:44.547641 | orchestrator | 2026-04-07 04:07:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:47.584165 | orchestrator | 2026-04-07 04:07:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:47.585409 | orchestrator | 2026-04-07 04:07:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:47.585476 | orchestrator | 2026-04-07 04:07:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:50.625033 | orchestrator | 2026-04-07 04:07:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:50.625781 | orchestrator | 2026-04-07 04:07:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:50.625813 | orchestrator | 2026-04-07 04:07:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:53.672049 | orchestrator | 2026-04-07 04:07:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:53.673467 | orchestrator | 2026-04-07 04:07:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:53.673515 | orchestrator | 2026-04-07 04:07:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:56.721061 | orchestrator | 2026-04-07 04:07:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:56.723070 | orchestrator | 2026-04-07 04:07:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:56.723167 | orchestrator | 2026-04-07 04:07:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:07:59.780098 | orchestrator | 2026-04-07 
04:07:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:07:59.783504 | orchestrator | 2026-04-07 04:07:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:07:59.783594 | orchestrator | 2026-04-07 04:07:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:08:02.825830 | orchestrator | 2026-04-07 04:08:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:08:02.828990 | orchestrator | 2026-04-07 04:08:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:08:02.829070 | orchestrator | 2026-04-07 04:08:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:08:05.876761 | orchestrator | 2026-04-07 04:08:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:08:05.879153 | orchestrator | 2026-04-07 04:08:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:08:05.879221 | orchestrator | 2026-04-07 04:08:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:08:08.919636 | orchestrator | 2026-04-07 04:08:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:08:08.921200 | orchestrator | 2026-04-07 04:08:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:08:08.921256 | orchestrator | 2026-04-07 04:08:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:08:11.971281 | orchestrator | 2026-04-07 04:08:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:08:11.973062 | orchestrator | 2026-04-07 04:08:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:08:11.973135 | orchestrator | 2026-04-07 04:08:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:08:15.025273 | orchestrator | 2026-04-07 04:08:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED
2026-04-07 04:08:15.026772 | orchestrator | 2026-04-07 04:08:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 04:08:15.026815 | orchestrator | 2026-04-07 04:08:15 | INFO  | Wait 1 second(s) until the next check
[... repeated polling output elided: tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 reported in state STARTED roughly every 3 seconds from 04:08:18 through 04:15:44, with a gap in the captured log between 04:08:48 and 04:10:51 ...]
2026-04-07 04:15:47.288120 | orchestrator | 2026-04-07 04:15:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 04:15:47.290722 | orchestrator | 2026-04-07 04:15:47 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:15:47.290893 | orchestrator | 2026-04-07 04:15:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:15:50.338586 | orchestrator | 2026-04-07 04:15:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:15:50.340309 | orchestrator | 2026-04-07 04:15:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:15:50.340397 | orchestrator | 2026-04-07 04:15:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:15:53.386731 | orchestrator | 2026-04-07 04:15:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:15:53.388983 | orchestrator | 2026-04-07 04:15:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:15:53.389074 | orchestrator | 2026-04-07 04:15:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:15:56.443756 | orchestrator | 2026-04-07 04:15:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:15:56.445747 | orchestrator | 2026-04-07 04:15:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:15:56.445790 | orchestrator | 2026-04-07 04:15:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:15:59.496112 | orchestrator | 2026-04-07 04:15:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:15:59.497580 | orchestrator | 2026-04-07 04:15:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:15:59.497618 | orchestrator | 2026-04-07 04:15:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:02.542571 | orchestrator | 2026-04-07 04:16:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:02.544420 | orchestrator | 2026-04-07 04:16:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:16:02.544476 | orchestrator | 2026-04-07 04:16:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:05.594090 | orchestrator | 2026-04-07 04:16:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:05.595577 | orchestrator | 2026-04-07 04:16:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:05.595652 | orchestrator | 2026-04-07 04:16:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:08.648137 | orchestrator | 2026-04-07 04:16:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:08.649276 | orchestrator | 2026-04-07 04:16:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:08.649407 | orchestrator | 2026-04-07 04:16:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:11.695200 | orchestrator | 2026-04-07 04:16:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:11.697572 | orchestrator | 2026-04-07 04:16:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:11.697653 | orchestrator | 2026-04-07 04:16:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:14.739439 | orchestrator | 2026-04-07 04:16:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:14.740846 | orchestrator | 2026-04-07 04:16:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:14.740892 | orchestrator | 2026-04-07 04:16:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:17.794682 | orchestrator | 2026-04-07 04:16:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:17.796608 | orchestrator | 2026-04-07 04:16:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:17.796840 | orchestrator | 2026-04-07 04:16:17 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 04:16:20.849522 | orchestrator | 2026-04-07 04:16:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:20.850468 | orchestrator | 2026-04-07 04:16:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:20.850569 | orchestrator | 2026-04-07 04:16:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:23.900924 | orchestrator | 2026-04-07 04:16:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:23.902731 | orchestrator | 2026-04-07 04:16:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:23.902770 | orchestrator | 2026-04-07 04:16:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:26.951548 | orchestrator | 2026-04-07 04:16:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:26.953688 | orchestrator | 2026-04-07 04:16:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:26.953758 | orchestrator | 2026-04-07 04:16:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:30.004498 | orchestrator | 2026-04-07 04:16:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:30.007798 | orchestrator | 2026-04-07 04:16:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:30.007878 | orchestrator | 2026-04-07 04:16:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:33.057982 | orchestrator | 2026-04-07 04:16:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:33.059559 | orchestrator | 2026-04-07 04:16:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:33.059631 | orchestrator | 2026-04-07 04:16:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:36.106875 | orchestrator | 2026-04-07 
04:16:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:36.107999 | orchestrator | 2026-04-07 04:16:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:36.108417 | orchestrator | 2026-04-07 04:16:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:39.157697 | orchestrator | 2026-04-07 04:16:39 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:39.164106 | orchestrator | 2026-04-07 04:16:39 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:39.164167 | orchestrator | 2026-04-07 04:16:39 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:42.213970 | orchestrator | 2026-04-07 04:16:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:42.216075 | orchestrator | 2026-04-07 04:16:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:42.216106 | orchestrator | 2026-04-07 04:16:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:45.260109 | orchestrator | 2026-04-07 04:16:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:45.262129 | orchestrator | 2026-04-07 04:16:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:45.262184 | orchestrator | 2026-04-07 04:16:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:48.309466 | orchestrator | 2026-04-07 04:16:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:48.311076 | orchestrator | 2026-04-07 04:16:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:48.311143 | orchestrator | 2026-04-07 04:16:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:51.353558 | orchestrator | 2026-04-07 04:16:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 04:16:51.355666 | orchestrator | 2026-04-07 04:16:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:51.355856 | orchestrator | 2026-04-07 04:16:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:54.406259 | orchestrator | 2026-04-07 04:16:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:54.408227 | orchestrator | 2026-04-07 04:16:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:54.408299 | orchestrator | 2026-04-07 04:16:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:16:57.456161 | orchestrator | 2026-04-07 04:16:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:16:57.457844 | orchestrator | 2026-04-07 04:16:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:16:57.457921 | orchestrator | 2026-04-07 04:16:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:00.504383 | orchestrator | 2026-04-07 04:17:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:00.506266 | orchestrator | 2026-04-07 04:17:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:00.506329 | orchestrator | 2026-04-07 04:17:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:03.558760 | orchestrator | 2026-04-07 04:17:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:03.560531 | orchestrator | 2026-04-07 04:17:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:03.560699 | orchestrator | 2026-04-07 04:17:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:06.610100 | orchestrator | 2026-04-07 04:17:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:06.612185 | orchestrator | 2026-04-07 04:17:06 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:06.612262 | orchestrator | 2026-04-07 04:17:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:09.662130 | orchestrator | 2026-04-07 04:17:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:09.663061 | orchestrator | 2026-04-07 04:17:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:09.663223 | orchestrator | 2026-04-07 04:17:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:12.708339 | orchestrator | 2026-04-07 04:17:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:12.709526 | orchestrator | 2026-04-07 04:17:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:12.709607 | orchestrator | 2026-04-07 04:17:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:15.759390 | orchestrator | 2026-04-07 04:17:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:15.762252 | orchestrator | 2026-04-07 04:17:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:15.762365 | orchestrator | 2026-04-07 04:17:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:18.810573 | orchestrator | 2026-04-07 04:17:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:18.812710 | orchestrator | 2026-04-07 04:17:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:18.812889 | orchestrator | 2026-04-07 04:17:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:21.857234 | orchestrator | 2026-04-07 04:17:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:21.859554 | orchestrator | 2026-04-07 04:17:21 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:17:21.859656 | orchestrator | 2026-04-07 04:17:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:24.911617 | orchestrator | 2026-04-07 04:17:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:24.913753 | orchestrator | 2026-04-07 04:17:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:24.913881 | orchestrator | 2026-04-07 04:17:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:27.953789 | orchestrator | 2026-04-07 04:17:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:27.955073 | orchestrator | 2026-04-07 04:17:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:27.955159 | orchestrator | 2026-04-07 04:17:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:31.004730 | orchestrator | 2026-04-07 04:17:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:31.007120 | orchestrator | 2026-04-07 04:17:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:31.007973 | orchestrator | 2026-04-07 04:17:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:34.050694 | orchestrator | 2026-04-07 04:17:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:34.052125 | orchestrator | 2026-04-07 04:17:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:34.052254 | orchestrator | 2026-04-07 04:17:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:37.107780 | orchestrator | 2026-04-07 04:17:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:37.110141 | orchestrator | 2026-04-07 04:17:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:37.110770 | orchestrator | 2026-04-07 04:17:37 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 04:17:40.164969 | orchestrator | 2026-04-07 04:17:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:40.166990 | orchestrator | 2026-04-07 04:17:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:40.167044 | orchestrator | 2026-04-07 04:17:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:43.214503 | orchestrator | 2026-04-07 04:17:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:43.216806 | orchestrator | 2026-04-07 04:17:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:43.216881 | orchestrator | 2026-04-07 04:17:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:46.266691 | orchestrator | 2026-04-07 04:17:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:46.268638 | orchestrator | 2026-04-07 04:17:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:46.268679 | orchestrator | 2026-04-07 04:17:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:49.323396 | orchestrator | 2026-04-07 04:17:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:49.325473 | orchestrator | 2026-04-07 04:17:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:49.325728 | orchestrator | 2026-04-07 04:17:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:52.369039 | orchestrator | 2026-04-07 04:17:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:52.370442 | orchestrator | 2026-04-07 04:17:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:52.370555 | orchestrator | 2026-04-07 04:17:52 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:55.418506 | orchestrator | 2026-04-07 
04:17:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:55.421516 | orchestrator | 2026-04-07 04:17:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:55.421581 | orchestrator | 2026-04-07 04:17:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:17:58.469116 | orchestrator | 2026-04-07 04:17:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:17:58.470387 | orchestrator | 2026-04-07 04:17:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:17:58.470432 | orchestrator | 2026-04-07 04:17:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:01.517600 | orchestrator | 2026-04-07 04:18:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:01.519077 | orchestrator | 2026-04-07 04:18:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:01.519158 | orchestrator | 2026-04-07 04:18:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:04.566839 | orchestrator | 2026-04-07 04:18:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:04.568537 | orchestrator | 2026-04-07 04:18:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:04.568593 | orchestrator | 2026-04-07 04:18:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:07.618730 | orchestrator | 2026-04-07 04:18:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:07.622712 | orchestrator | 2026-04-07 04:18:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:07.622788 | orchestrator | 2026-04-07 04:18:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:10.668695 | orchestrator | 2026-04-07 04:18:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 04:18:10.670465 | orchestrator | 2026-04-07 04:18:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:10.670497 | orchestrator | 2026-04-07 04:18:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:13.717385 | orchestrator | 2026-04-07 04:18:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:13.719333 | orchestrator | 2026-04-07 04:18:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:13.719397 | orchestrator | 2026-04-07 04:18:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:16.765886 | orchestrator | 2026-04-07 04:18:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:16.768163 | orchestrator | 2026-04-07 04:18:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:16.768228 | orchestrator | 2026-04-07 04:18:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:19.813974 | orchestrator | 2026-04-07 04:18:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:19.815064 | orchestrator | 2026-04-07 04:18:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:19.815194 | orchestrator | 2026-04-07 04:18:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:22.861997 | orchestrator | 2026-04-07 04:18:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:22.864490 | orchestrator | 2026-04-07 04:18:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:22.864577 | orchestrator | 2026-04-07 04:18:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:25.917067 | orchestrator | 2026-04-07 04:18:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:25.919755 | orchestrator | 2026-04-07 04:18:25 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:25.919793 | orchestrator | 2026-04-07 04:18:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:28.972053 | orchestrator | 2026-04-07 04:18:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:28.974193 | orchestrator | 2026-04-07 04:18:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:28.974333 | orchestrator | 2026-04-07 04:18:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:32.027147 | orchestrator | 2026-04-07 04:18:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:32.028282 | orchestrator | 2026-04-07 04:18:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:32.028525 | orchestrator | 2026-04-07 04:18:32 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:35.075200 | orchestrator | 2026-04-07 04:18:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:35.076769 | orchestrator | 2026-04-07 04:18:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:35.076821 | orchestrator | 2026-04-07 04:18:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:38.126469 | orchestrator | 2026-04-07 04:18:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:38.129167 | orchestrator | 2026-04-07 04:18:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:38.129242 | orchestrator | 2026-04-07 04:18:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:41.174930 | orchestrator | 2026-04-07 04:18:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:41.177428 | orchestrator | 2026-04-07 04:18:41 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:18:41.177483 | orchestrator | 2026-04-07 04:18:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:44.222801 | orchestrator | 2026-04-07 04:18:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:44.224227 | orchestrator | 2026-04-07 04:18:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:44.224254 | orchestrator | 2026-04-07 04:18:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:47.274157 | orchestrator | 2026-04-07 04:18:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:47.277107 | orchestrator | 2026-04-07 04:18:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:47.277245 | orchestrator | 2026-04-07 04:18:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:50.322967 | orchestrator | 2026-04-07 04:18:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:50.323814 | orchestrator | 2026-04-07 04:18:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:50.323860 | orchestrator | 2026-04-07 04:18:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:53.377036 | orchestrator | 2026-04-07 04:18:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:53.379532 | orchestrator | 2026-04-07 04:18:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:53.379596 | orchestrator | 2026-04-07 04:18:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:18:56.428293 | orchestrator | 2026-04-07 04:18:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:56.430633 | orchestrator | 2026-04-07 04:18:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:56.430695 | orchestrator | 2026-04-07 04:18:56 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 04:18:59.483361 | orchestrator | 2026-04-07 04:18:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:18:59.483819 | orchestrator | 2026-04-07 04:18:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:18:59.483848 | orchestrator | 2026-04-07 04:18:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:02.537505 | orchestrator | 2026-04-07 04:19:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:02.538660 | orchestrator | 2026-04-07 04:19:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:02.538726 | orchestrator | 2026-04-07 04:19:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:05.589309 | orchestrator | 2026-04-07 04:19:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:05.591842 | orchestrator | 2026-04-07 04:19:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:05.591901 | orchestrator | 2026-04-07 04:19:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:08.643353 | orchestrator | 2026-04-07 04:19:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:08.644603 | orchestrator | 2026-04-07 04:19:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:08.644663 | orchestrator | 2026-04-07 04:19:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:11.692709 | orchestrator | 2026-04-07 04:19:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:11.693726 | orchestrator | 2026-04-07 04:19:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:11.693769 | orchestrator | 2026-04-07 04:19:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:14.742332 | orchestrator | 2026-04-07 
04:19:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:14.743840 | orchestrator | 2026-04-07 04:19:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:14.743940 | orchestrator | 2026-04-07 04:19:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:17.790461 | orchestrator | 2026-04-07 04:19:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:17.792161 | orchestrator | 2026-04-07 04:19:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:17.792380 | orchestrator | 2026-04-07 04:19:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:20.842832 | orchestrator | 2026-04-07 04:19:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:20.844619 | orchestrator | 2026-04-07 04:19:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:20.844696 | orchestrator | 2026-04-07 04:19:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:23.894978 | orchestrator | 2026-04-07 04:19:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:23.896828 | orchestrator | 2026-04-07 04:19:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:23.896881 | orchestrator | 2026-04-07 04:19:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:26.951411 | orchestrator | 2026-04-07 04:19:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:19:26.954358 | orchestrator | 2026-04-07 04:19:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:19:26.954444 | orchestrator | 2026-04-07 04:19:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:19:30.004271 | orchestrator | 2026-04-07 04:19:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED
2026-04-07 04:19:30.006860 | orchestrator | 2026-04-07 04:19:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 04:19:30.006955 | orchestrator | 2026-04-07 04:19:30 | INFO  | Wait 1 second(s) until the next check
[… identical polling messages repeated every ~3 seconds from 04:19:33 through 04:24:44: tasks 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 and 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 remain in state STARTED, followed by "Wait 1 second(s) until the next check" …]
2026-04-07 04:24:47.135830 | orchestrator | 2026-04-07 04:24:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state
STARTED 2026-04-07 04:24:47.137560 | orchestrator | 2026-04-07 04:24:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:24:47.137597 | orchestrator | 2026-04-07 04:24:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:24:50.188377 | orchestrator | 2026-04-07 04:24:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:24:50.190668 | orchestrator | 2026-04-07 04:24:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:24:50.190731 | orchestrator | 2026-04-07 04:24:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:24:53.235265 | orchestrator | 2026-04-07 04:24:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:24:53.237153 | orchestrator | 2026-04-07 04:24:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:24:53.237196 | orchestrator | 2026-04-07 04:24:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:24:56.280155 | orchestrator | 2026-04-07 04:24:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:24:56.282006 | orchestrator | 2026-04-07 04:24:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:24:56.282164 | orchestrator | 2026-04-07 04:24:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:24:59.329452 | orchestrator | 2026-04-07 04:24:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:24:59.331097 | orchestrator | 2026-04-07 04:24:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:24:59.331145 | orchestrator | 2026-04-07 04:24:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:02.376157 | orchestrator | 2026-04-07 04:25:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:02.376365 | orchestrator | 2026-04-07 04:25:02 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:02.376483 | orchestrator | 2026-04-07 04:25:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:05.426413 | orchestrator | 2026-04-07 04:25:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:05.428308 | orchestrator | 2026-04-07 04:25:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:05.428376 | orchestrator | 2026-04-07 04:25:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:08.477621 | orchestrator | 2026-04-07 04:25:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:08.479538 | orchestrator | 2026-04-07 04:25:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:08.479683 | orchestrator | 2026-04-07 04:25:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:11.532632 | orchestrator | 2026-04-07 04:25:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:11.534585 | orchestrator | 2026-04-07 04:25:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:11.534698 | orchestrator | 2026-04-07 04:25:11 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:14.586442 | orchestrator | 2026-04-07 04:25:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:14.588262 | orchestrator | 2026-04-07 04:25:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:14.588330 | orchestrator | 2026-04-07 04:25:14 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:17.638217 | orchestrator | 2026-04-07 04:25:17 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:17.641044 | orchestrator | 2026-04-07 04:25:17 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:25:17.641063 | orchestrator | 2026-04-07 04:25:17 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:20.687888 | orchestrator | 2026-04-07 04:25:20 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:20.690075 | orchestrator | 2026-04-07 04:25:20 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:20.690152 | orchestrator | 2026-04-07 04:25:20 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:23.738339 | orchestrator | 2026-04-07 04:25:23 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:23.741287 | orchestrator | 2026-04-07 04:25:23 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:23.741403 | orchestrator | 2026-04-07 04:25:23 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:26.792434 | orchestrator | 2026-04-07 04:25:26 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:26.794069 | orchestrator | 2026-04-07 04:25:26 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:26.794471 | orchestrator | 2026-04-07 04:25:26 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:29.841792 | orchestrator | 2026-04-07 04:25:29 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:29.844055 | orchestrator | 2026-04-07 04:25:29 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:29.844137 | orchestrator | 2026-04-07 04:25:29 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:32.894132 | orchestrator | 2026-04-07 04:25:32 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:32.897398 | orchestrator | 2026-04-07 04:25:32 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:32.897476 | orchestrator | 2026-04-07 04:25:32 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 04:25:35.944193 | orchestrator | 2026-04-07 04:25:35 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:35.946339 | orchestrator | 2026-04-07 04:25:35 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:35.946400 | orchestrator | 2026-04-07 04:25:35 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:38.985540 | orchestrator | 2026-04-07 04:25:38 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:38.987584 | orchestrator | 2026-04-07 04:25:38 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:38.987703 | orchestrator | 2026-04-07 04:25:38 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:42.034924 | orchestrator | 2026-04-07 04:25:42 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:42.036922 | orchestrator | 2026-04-07 04:25:42 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:42.036979 | orchestrator | 2026-04-07 04:25:42 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:45.081944 | orchestrator | 2026-04-07 04:25:45 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:45.083859 | orchestrator | 2026-04-07 04:25:45 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:45.083909 | orchestrator | 2026-04-07 04:25:45 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:48.130162 | orchestrator | 2026-04-07 04:25:48 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:48.132511 | orchestrator | 2026-04-07 04:25:48 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:48.132561 | orchestrator | 2026-04-07 04:25:48 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:51.189508 | orchestrator | 2026-04-07 
04:25:51 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:51.191023 | orchestrator | 2026-04-07 04:25:51 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:51.191054 | orchestrator | 2026-04-07 04:25:51 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:54.245120 | orchestrator | 2026-04-07 04:25:54 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:54.245951 | orchestrator | 2026-04-07 04:25:54 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:54.246059 | orchestrator | 2026-04-07 04:25:54 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:25:57.296339 | orchestrator | 2026-04-07 04:25:57 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:25:57.297012 | orchestrator | 2026-04-07 04:25:57 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:25:57.297176 | orchestrator | 2026-04-07 04:25:57 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:00.347574 | orchestrator | 2026-04-07 04:26:00 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:00.349296 | orchestrator | 2026-04-07 04:26:00 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:00.349507 | orchestrator | 2026-04-07 04:26:00 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:03.398992 | orchestrator | 2026-04-07 04:26:03 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:03.402379 | orchestrator | 2026-04-07 04:26:03 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:03.402461 | orchestrator | 2026-04-07 04:26:03 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:06.451961 | orchestrator | 2026-04-07 04:26:06 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 04:26:06.453132 | orchestrator | 2026-04-07 04:26:06 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:06.453174 | orchestrator | 2026-04-07 04:26:06 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:09.502494 | orchestrator | 2026-04-07 04:26:09 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:09.505739 | orchestrator | 2026-04-07 04:26:09 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:09.505803 | orchestrator | 2026-04-07 04:26:09 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:12.550650 | orchestrator | 2026-04-07 04:26:12 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:12.552349 | orchestrator | 2026-04-07 04:26:12 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:12.552552 | orchestrator | 2026-04-07 04:26:12 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:15.601847 | orchestrator | 2026-04-07 04:26:15 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:15.604585 | orchestrator | 2026-04-07 04:26:15 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:15.604653 | orchestrator | 2026-04-07 04:26:15 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:18.645617 | orchestrator | 2026-04-07 04:26:18 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:18.647268 | orchestrator | 2026-04-07 04:26:18 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:18.647338 | orchestrator | 2026-04-07 04:26:18 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:21.694537 | orchestrator | 2026-04-07 04:26:21 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:21.696199 | orchestrator | 2026-04-07 04:26:21 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:21.696251 | orchestrator | 2026-04-07 04:26:21 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:24.749477 | orchestrator | 2026-04-07 04:26:24 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:24.750504 | orchestrator | 2026-04-07 04:26:24 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:24.750653 | orchestrator | 2026-04-07 04:26:24 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:27.800962 | orchestrator | 2026-04-07 04:26:27 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:27.803036 | orchestrator | 2026-04-07 04:26:27 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:27.803128 | orchestrator | 2026-04-07 04:26:27 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:30.851772 | orchestrator | 2026-04-07 04:26:30 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:30.853067 | orchestrator | 2026-04-07 04:26:30 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:30.853397 | orchestrator | 2026-04-07 04:26:30 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:33.916857 | orchestrator | 2026-04-07 04:26:33 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:33.919649 | orchestrator | 2026-04-07 04:26:33 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:33.919731 | orchestrator | 2026-04-07 04:26:33 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:36.969660 | orchestrator | 2026-04-07 04:26:36 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:36.971437 | orchestrator | 2026-04-07 04:26:36 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:26:36.971485 | orchestrator | 2026-04-07 04:26:36 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:40.021975 | orchestrator | 2026-04-07 04:26:40 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:40.024584 | orchestrator | 2026-04-07 04:26:40 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:40.024689 | orchestrator | 2026-04-07 04:26:40 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:43.066072 | orchestrator | 2026-04-07 04:26:43 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:43.067233 | orchestrator | 2026-04-07 04:26:43 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:43.067330 | orchestrator | 2026-04-07 04:26:43 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:46.116120 | orchestrator | 2026-04-07 04:26:46 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:46.117580 | orchestrator | 2026-04-07 04:26:46 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:46.117623 | orchestrator | 2026-04-07 04:26:46 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:49.160668 | orchestrator | 2026-04-07 04:26:49 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:49.162674 | orchestrator | 2026-04-07 04:26:49 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:49.162721 | orchestrator | 2026-04-07 04:26:49 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:52.213711 | orchestrator | 2026-04-07 04:26:52 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:52.215333 | orchestrator | 2026-04-07 04:26:52 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:52.215372 | orchestrator | 2026-04-07 04:26:52 | INFO  | Wait 1 second(s) 
until the next check 2026-04-07 04:26:55.267066 | orchestrator | 2026-04-07 04:26:55 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:55.267971 | orchestrator | 2026-04-07 04:26:55 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:55.268009 | orchestrator | 2026-04-07 04:26:55 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:26:58.314359 | orchestrator | 2026-04-07 04:26:58 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:26:58.316653 | orchestrator | 2026-04-07 04:26:58 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:26:58.316722 | orchestrator | 2026-04-07 04:26:58 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:01.360148 | orchestrator | 2026-04-07 04:27:01 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:01.362570 | orchestrator | 2026-04-07 04:27:01 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:01.362631 | orchestrator | 2026-04-07 04:27:01 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:04.409964 | orchestrator | 2026-04-07 04:27:04 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:04.411644 | orchestrator | 2026-04-07 04:27:04 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:04.411839 | orchestrator | 2026-04-07 04:27:04 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:07.459513 | orchestrator | 2026-04-07 04:27:07 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:07.462248 | orchestrator | 2026-04-07 04:27:07 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:07.462371 | orchestrator | 2026-04-07 04:27:07 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:10.518615 | orchestrator | 2026-04-07 
04:27:10 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:10.519435 | orchestrator | 2026-04-07 04:27:10 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:10.519627 | orchestrator | 2026-04-07 04:27:10 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:13.564760 | orchestrator | 2026-04-07 04:27:13 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:13.566256 | orchestrator | 2026-04-07 04:27:13 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:13.566290 | orchestrator | 2026-04-07 04:27:13 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:16.614316 | orchestrator | 2026-04-07 04:27:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:16.616527 | orchestrator | 2026-04-07 04:27:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:16.616572 | orchestrator | 2026-04-07 04:27:16 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:19.660611 | orchestrator | 2026-04-07 04:27:19 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:19.662247 | orchestrator | 2026-04-07 04:27:19 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:19.662450 | orchestrator | 2026-04-07 04:27:19 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:22.706960 | orchestrator | 2026-04-07 04:27:22 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:22.708543 | orchestrator | 2026-04-07 04:27:22 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:22.708603 | orchestrator | 2026-04-07 04:27:22 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:25.752494 | orchestrator | 2026-04-07 04:27:25 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state 
STARTED 2026-04-07 04:27:25.755988 | orchestrator | 2026-04-07 04:27:25 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:25.756063 | orchestrator | 2026-04-07 04:27:25 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:28.802784 | orchestrator | 2026-04-07 04:27:28 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:28.803676 | orchestrator | 2026-04-07 04:27:28 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:28.803702 | orchestrator | 2026-04-07 04:27:28 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:31.858056 | orchestrator | 2026-04-07 04:27:31 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:31.859076 | orchestrator | 2026-04-07 04:27:31 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:31.859331 | orchestrator | 2026-04-07 04:27:31 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:34.904793 | orchestrator | 2026-04-07 04:27:34 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:34.906589 | orchestrator | 2026-04-07 04:27:34 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:34.906673 | orchestrator | 2026-04-07 04:27:34 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:37.953944 | orchestrator | 2026-04-07 04:27:37 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:37.956264 | orchestrator | 2026-04-07 04:27:37 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:37.956397 | orchestrator | 2026-04-07 04:27:37 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:41.003312 | orchestrator | 2026-04-07 04:27:41 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:41.005032 | orchestrator | 2026-04-07 04:27:41 | INFO  
| Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:41.005161 | orchestrator | 2026-04-07 04:27:41 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:44.045610 | orchestrator | 2026-04-07 04:27:44 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:44.047343 | orchestrator | 2026-04-07 04:27:44 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:44.047396 | orchestrator | 2026-04-07 04:27:44 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:47.106229 | orchestrator | 2026-04-07 04:27:47 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:47.107567 | orchestrator | 2026-04-07 04:27:47 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:47.107655 | orchestrator | 2026-04-07 04:27:47 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:50.155592 | orchestrator | 2026-04-07 04:27:50 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:50.160174 | orchestrator | 2026-04-07 04:27:50 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:50.160260 | orchestrator | 2026-04-07 04:27:50 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:53.206588 | orchestrator | 2026-04-07 04:27:53 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:53.207595 | orchestrator | 2026-04-07 04:27:53 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:53.207625 | orchestrator | 2026-04-07 04:27:53 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:56.256269 | orchestrator | 2026-04-07 04:27:56 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:56.256731 | orchestrator | 2026-04-07 04:27:56 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 
04:27:56.257261 | orchestrator | 2026-04-07 04:27:56 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:27:59.307217 | orchestrator | 2026-04-07 04:27:59 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:27:59.309659 | orchestrator | 2026-04-07 04:27:59 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:27:59.309791 | orchestrator | 2026-04-07 04:27:59 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:28:02.364827 | orchestrator | 2026-04-07 04:28:02 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:28:02.366928 | orchestrator | 2026-04-07 04:28:02 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:28:02.366971 | orchestrator | 2026-04-07 04:28:02 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:28:05.410889 | orchestrator | 2026-04-07 04:28:05 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:28:05.412752 | orchestrator | 2026-04-07 04:28:05 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:28:05.412940 | orchestrator | 2026-04-07 04:28:05 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:28:08.458194 | orchestrator | 2026-04-07 04:28:08 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:28:08.459698 | orchestrator | 2026-04-07 04:28:08 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:28:08.459811 | orchestrator | 2026-04-07 04:28:08 | INFO  | Wait 1 second(s) until the next check 2026-04-07 04:28:11.514802 | orchestrator | 2026-04-07 04:28:11 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED 2026-04-07 04:28:11.516150 | orchestrator | 2026-04-07 04:28:11 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED 2026-04-07 04:28:11.516214 | orchestrator | 2026-04-07 04:28:11 | INFO  | Wait 1 second(s) 
until the next check
2026-04-07 04:28:14.565745 | orchestrator | 2026-04-07 04:28:14 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 04:28:14.568205 | orchestrator | 2026-04-07 04:28:14 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 04:28:14.568282 | orchestrator | 2026-04-07 04:28:14 | INFO  | Wait 1 second(s) until the next check
[... identical STARTED status checks for the same two tasks repeated every ~3 seconds from 04:28:17 through 04:30:16 ...]
2026-04-07 04:30:16.606938 | orchestrator | 2026-04-07 04:30:16 | INFO  | Task 5c9c2e81-ac33-4989-aac3-1fbbd4026c04 is in state STARTED
2026-04-07 04:30:16.607042 | orchestrator | 2026-04-07 04:30:16 | INFO  | Task 0c3f117b-00d6-4eb1-badc-6e4180f59dd7 is in state STARTED
2026-04-07 04:30:16.607074 | orchestrator | 2026-04-07 04:30:16 | INFO  | Wait 1 second(s) until the next check
2026-04-07 04:30:19.580742 | RUN END RESULT_TIMED_OUT: [untrusted : github.com/osism/testbed/playbooks/deploy.yml@main]
2026-04-07 04:30:19.585639 | POST-RUN START: [untrusted :
github.com/osism/testbed/playbooks/post.yml@main]
2026-04-07 04:30:20.562046 |
2026-04-07 04:30:20.562331 | PLAY [Post output play]
2026-04-07 04:30:20.594965 |
2026-04-07 04:30:20.595331 | LOOP [stage-output : Register sources]
2026-04-07 04:30:20.672286 |
2026-04-07 04:30:20.672580 | TASK [stage-output : Check sudo]
2026-04-07 04:30:21.529170 | orchestrator | sudo: a password is required
2026-04-07 04:30:21.713580 | orchestrator | ok: Runtime: 0:00:00.014496
2026-04-07 04:30:21.728313 |
2026-04-07 04:30:21.728495 | LOOP [stage-output : Set source and destination for files and folders]
2026-04-07 04:30:21.764266 |
2026-04-07 04:30:21.764524 | TASK [stage-output : Build a list of source, dest dictionaries]
2026-04-07 04:30:21.834395 | orchestrator | ok
2026-04-07 04:30:21.843464 |
2026-04-07 04:30:21.843622 | LOOP [stage-output : Ensure target folders exist]
2026-04-07 04:30:22.324563 | orchestrator | ok: "docs"
2026-04-07 04:30:22.324909 |
2026-04-07 04:30:22.603940 | orchestrator | ok: "artifacts"
2026-04-07 04:30:22.862600 | orchestrator | ok: "logs"
2026-04-07 04:30:22.877250 |
2026-04-07 04:30:22.877403 | LOOP [stage-output : Copy files and folders to staging folder]
2026-04-07 04:30:22.910020 |
2026-04-07 04:30:22.910286 | TASK [stage-output : Make all log files readable]
2026-04-07 04:30:23.232994 | orchestrator | ok
2026-04-07 04:30:23.243287 |
2026-04-07 04:30:23.243435 | TASK [stage-output : Rename log files that match extensions_to_txt]
2026-04-07 04:30:23.278595 | orchestrator | skipping: Conditional result was False
2026-04-07 04:30:23.295742 |
2026-04-07 04:30:23.295890 | TASK [stage-output : Discover log files for compression]
2026-04-07 04:30:23.321004 | orchestrator | skipping: Conditional result was False
2026-04-07 04:30:23.338956 |
2026-04-07 04:30:23.339158 | LOOP [stage-output : Archive everything from logs]
2026-04-07 04:30:23.386476 |
2026-04-07 04:30:23.386777 | PLAY [Post cleanup play]
2026-04-07 04:30:23.399260 |
2026-04-07 04:30:23.399392 | TASK [Set cloud fact (Zuul deployment)]
2026-04-07 04:30:23.466408 | orchestrator | ok
2026-04-07 04:30:23.477384 |
2026-04-07 04:30:23.477506 | TASK [Set cloud fact (local deployment)]
2026-04-07 04:30:23.512062 | orchestrator | skipping: Conditional result was False
2026-04-07 04:30:23.525676 |
2026-04-07 04:30:23.525815 | TASK [Clean the cloud environment]
2026-04-07 04:30:24.214780 | orchestrator | 2026-04-07 04:30:24 - clean up servers
2026-04-07 04:30:25.162589 | orchestrator | 2026-04-07 04:30:25 - testbed-manager
2026-04-07 04:30:25.258643 | orchestrator | 2026-04-07 04:30:25 - testbed-node-1
2026-04-07 04:30:25.352417 | orchestrator | 2026-04-07 04:30:25 - testbed-node-5
2026-04-07 04:30:25.448610 | orchestrator | 2026-04-07 04:30:25 - testbed-node-2
2026-04-07 04:30:25.551808 | orchestrator | 2026-04-07 04:30:25 - testbed-node-4
2026-04-07 04:30:25.650547 | orchestrator | 2026-04-07 04:30:25 - testbed-node-0
2026-04-07 04:30:25.747777 | orchestrator | 2026-04-07 04:30:25 - testbed-node-3
2026-04-07 04:30:25.852691 | orchestrator | 2026-04-07 04:30:25 - clean up keypairs
2026-04-07 04:30:25.872494 | orchestrator | 2026-04-07 04:30:25 - testbed
2026-04-07 04:30:25.898755 | orchestrator | 2026-04-07 04:30:25 - wait for servers to be gone
2026-04-07 04:30:36.803013 | orchestrator | 2026-04-07 04:30:36 - clean up ports
2026-04-07 04:30:37.005521 | orchestrator | 2026-04-07 04:30:37 - 1ec611f6-3585-4cf4-a202-9aa82078f808
2026-04-07 04:30:37.298834 | orchestrator | 2026-04-07 04:30:37 - 2efc7932-33c1-4954-96fc-f43d4a4cf0cd
2026-04-07 04:30:37.556722 | orchestrator | 2026-04-07 04:30:37 - 3ae34eab-344b-47c8-80d0-5f1ffb9db47a
2026-04-07 04:30:37.809574 | orchestrator | 2026-04-07 04:30:37 - 4500d28f-617f-4d9a-8577-be05dcedd7a7
2026-04-07 04:30:38.027792 | orchestrator | 2026-04-07 04:30:38 - c0c244a0-fd85-4fc9-8f48-f2c9c27d00c3
2026-04-07 04:30:38.293463 | orchestrator | 2026-04-07 04:30:38 - c77eb69f-2367-487d-be5d-bac634e56067
2026-04-07 04:30:38.502869 | orchestrator | 2026-04-07 04:30:38 - dc25f09a-0883-4c56-a9f1-b26f54744cb4
2026-04-07 04:30:38.984227 | orchestrator | 2026-04-07 04:30:38 - clean up volumes
2026-04-07 04:30:39.150512 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-0-node-base
2026-04-07 04:30:39.189640 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-1-node-base
2026-04-07 04:30:39.229575 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-3-node-base
2026-04-07 04:30:39.276139 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-manager-base
2026-04-07 04:30:39.318474 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-4-node-base
2026-04-07 04:30:39.367477 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-2-node-base
2026-04-07 04:30:39.413643 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-5-node-base
2026-04-07 04:30:39.463148 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-3-node-3
2026-04-07 04:30:39.509876 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-8-node-5
2026-04-07 04:30:39.554865 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-2-node-5
2026-04-07 04:30:39.603529 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-6-node-3
2026-04-07 04:30:39.661048 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-4-node-4
2026-04-07 04:30:39.705289 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-5-node-5
2026-04-07 04:30:39.751592 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-7-node-4
2026-04-07 04:30:39.808279 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-1-node-4
2026-04-07 04:30:39.852836 | orchestrator | 2026-04-07 04:30:39 - testbed-volume-0-node-3
2026-04-07 04:30:39.897071 | orchestrator | 2026-04-07 04:30:39 - disconnect routers
2026-04-07 04:30:40.027501 | orchestrator | 2026-04-07 04:30:40 - testbed
2026-04-07 04:30:41.055293 | orchestrator | 2026-04-07 04:30:41 - clean up subnets
2026-04-07 04:30:41.116692 | orchestrator | 2026-04-07 04:30:41 - subnet-testbed-management
2026-04-07 04:30:41.315087 | orchestrator | 2026-04-07 04:30:41 - clean up networks
2026-04-07 04:30:41.492311 | orchestrator | 2026-04-07 04:30:41 - net-testbed-management
2026-04-07 04:30:41.778529 | orchestrator | 2026-04-07 04:30:41 - clean up security groups
2026-04-07 04:30:41.818855 | orchestrator | 2026-04-07 04:30:41 - testbed-node
2026-04-07 04:30:41.927615 | orchestrator | 2026-04-07 04:30:41 - testbed-management
2026-04-07 04:30:42.039510 | orchestrator | 2026-04-07 04:30:42 - clean up floating ips
2026-04-07 04:30:42.080671 | orchestrator | 2026-04-07 04:30:42 - 81.163.192.120
2026-04-07 04:30:42.460882 | orchestrator | 2026-04-07 04:30:42 - clean up routers
2026-04-07 04:30:42.522569 | orchestrator | 2026-04-07 04:30:42 - testbed
2026-04-07 04:30:43.672332 | orchestrator | ok: Runtime: 0:00:19.525975
2026-04-07 04:30:43.676937 |
2026-04-07 04:30:43.677122 | PLAY RECAP
2026-04-07 04:30:43.677234 | orchestrator | ok: 6 changed: 2 unreachable: 0 failed: 0 skipped: 7 rescued: 0 ignored: 0
2026-04-07 04:30:43.677286 |
2026-04-07 04:30:43.816826 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2026-04-07 04:30:43.818402 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main]
2026-04-07 04:30:44.577768 |
2026-04-07 04:30:44.578162 | PLAY [Cleanup play]
2026-04-07 04:30:44.595161 |
2026-04-07 04:30:44.595301 | TASK [Set cloud fact (Zuul deployment)]
2026-04-07 04:30:44.647293 | orchestrator | ok
2026-04-07 04:30:44.654341 |
2026-04-07 04:30:44.654482 | TASK [Set cloud fact (local deployment)]
2026-04-07 04:30:44.678379 | orchestrator | skipping: Conditional result was False
2026-04-07 04:30:44.687044 |
2026-04-07 04:30:44.687188 | TASK [Clean the cloud environment]
2026-04-07 04:30:45.894181 | orchestrator | 2026-04-07 04:30:45 - clean up servers
2026-04-07 04:30:46.470251 | orchestrator | 2026-04-07 04:30:46 - clean up keypairs
2026-04-07 04:30:46.486850 | orchestrator | 2026-04-07 04:30:46 - wait for servers to be gone
2026-04-07 04:30:46.527673 | orchestrator | 2026-04-07 04:30:46 - clean up ports
2026-04-07 04:30:46.609316 | orchestrator | 2026-04-07 04:30:46 - clean up volumes
2026-04-07 04:30:46.679721 | orchestrator | 2026-04-07 04:30:46 - disconnect routers
2026-04-07 04:30:46.705853 | orchestrator | 2026-04-07 04:30:46 - clean up subnets
2026-04-07 04:30:46.727970 | orchestrator | 2026-04-07 04:30:46 - clean up networks
2026-04-07 04:30:46.890991 | orchestrator | 2026-04-07 04:30:46 - clean up security groups
2026-04-07 04:30:46.924752 | orchestrator | 2026-04-07 04:30:46 - clean up floating ips
2026-04-07 04:30:46.960545 | orchestrator | 2026-04-07 04:30:46 - clean up routers
2026-04-07 04:30:47.224282 | orchestrator | ok: Runtime: 0:00:01.491227
2026-04-07 04:30:47.228522 |
2026-04-07 04:30:47.228718 | PLAY RECAP
2026-04-07 04:30:47.228846 | orchestrator | ok: 2 changed: 1 unreachable: 0 failed: 0 skipped: 1 rescued: 0 ignored: 0
2026-04-07 04:30:47.228909 |
2026-04-07 04:30:47.362211 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main]
2026-04-07 04:30:47.363804 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2026-04-07 04:30:48.159718 |
2026-04-07 04:30:48.159885 | PLAY [Base post-fetch]
2026-04-07 04:30:48.183878 |
2026-04-07 04:30:48.184232 | TASK [fetch-output : Set log path for multiple nodes]
2026-04-07 04:30:48.241325 | orchestrator | skipping: Conditional result was False
2026-04-07 04:30:48.248217 |
2026-04-07 04:30:48.248373 | TASK [fetch-output : Set log path for single node]
2026-04-07 04:30:48.301833 | orchestrator | ok
2026-04-07 04:30:48.309237 |
2026-04-07 04:30:48.309367 | LOOP [fetch-output : Ensure local output dirs]
2026-04-07 04:30:49.018062 | orchestrator -> localhost | ok: "/var/lib/zuul/builds/aef34ac854674087aa01508de92070da/work/logs"
2026-04-07 04:30:49.297361 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/aef34ac854674087aa01508de92070da/work/artifacts"
2026-04-07 04:30:49.634946 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/aef34ac854674087aa01508de92070da/work/docs"
2026-04-07 04:30:49.658704 |
2026-04-07 04:30:49.658896 | LOOP [fetch-output : Collect logs, artifacts and docs]
2026-04-07 04:30:50.669037 | orchestrator | changed: .d..t...... ./
2026-04-07 04:30:50.669298 | orchestrator | changed: All items complete
2026-04-07 04:30:50.669339 |
2026-04-07 04:30:51.401726 | orchestrator | changed: .d..t...... ./
2026-04-07 04:30:52.202436 | orchestrator | changed: .d..t...... ./
2026-04-07 04:30:52.234636 |
2026-04-07 04:30:52.234791 | LOOP [merge-output-to-logs : Move artifacts and docs to logs dir]
2026-04-07 04:30:52.271570 | orchestrator | skipping: Conditional result was False
2026-04-07 04:30:52.283494 | orchestrator | skipping: Conditional result was False
2026-04-07 04:30:52.300614 |
2026-04-07 04:30:52.300705 | PLAY RECAP
2026-04-07 04:30:52.300759 | orchestrator | ok: 3 changed: 2 unreachable: 0 failed: 0 skipped: 2 rescued: 0 ignored: 0
2026-04-07 04:30:52.300786 |
2026-04-07 04:30:52.428058 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2026-04-07 04:30:52.429224 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2026-04-07 04:30:53.219584 |
2026-04-07 04:30:53.219762 | PLAY [Base post]
2026-04-07 04:30:53.235071 |
2026-04-07 04:30:53.235242 | TASK [remove-build-sshkey : Remove the build SSH key from all nodes]
2026-04-07 04:30:54.422906 | orchestrator | changed
2026-04-07 04:30:54.435354 |
2026-04-07 04:30:54.435511 | PLAY RECAP
2026-04-07 04:30:54.435600 | orchestrator | ok: 1 changed: 1 unreachable: 0 failed: 0 skipped: 0 rescued: 0 ignored: 0
2026-04-07 04:30:54.435685 |
2026-04-07 04:30:54.576269 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2026-04-07 04:30:54.577355 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-logs.yaml@main]
2026-04-07 04:30:55.413524 |
2026-04-07 04:30:55.413711 | PLAY [Base post-logs]
2026-04-07 04:30:55.425339 |
2026-04-07 04:30:55.425496 | TASK [generate-zuul-manifest : Generate Zuul manifest]
2026-04-07 04:30:55.930867 | localhost | changed
2026-04-07 04:30:55.944041 |
2026-04-07 04:30:55.944281 | TASK [generate-zuul-manifest : Return Zuul manifest URL to Zuul]
2026-04-07 04:30:55.982592 | localhost | ok
2026-04-07 04:30:55.989473 |
2026-04-07 04:30:55.989653 | TASK [Set zuul-log-path fact]
2026-04-07 04:30:56.018960 | localhost | ok
2026-04-07 04:30:56.040206 |
2026-04-07 04:30:56.040393 | TASK [set-zuul-log-path-fact : Set log path for a build]
2026-04-07 04:30:56.067935 | localhost | ok
2026-04-07 04:30:56.073916 |
2026-04-07 04:30:56.074081 | TASK [upload-logs : Create log directories]
2026-04-07 04:30:56.629223 | localhost | changed
2026-04-07 04:30:56.632143 |
2026-04-07 04:30:56.632258 | TASK [upload-logs : Ensure logs are readable before uploading]
2026-04-07 04:30:57.159793 | localhost -> localhost | ok: Runtime: 0:00:00.007947
2026-04-07 04:30:57.166575 |
2026-04-07 04:30:57.166747 | TASK [upload-logs : Upload logs to log server]
2026-04-07 04:30:57.753953 | localhost | Output suppressed because no_log was given
2026-04-07 04:30:57.758643 |
2026-04-07 04:30:57.758962 | LOOP [upload-logs : Compress console log and json output]
2026-04-07 04:30:57.818545 | localhost | skipping: Conditional result was False
2026-04-07 04:30:57.823600 | localhost | skipping: Conditional result was False
2026-04-07 04:30:57.831770 |
2026-04-07 04:30:57.832021 | LOOP [upload-logs : Upload compressed console log and json output]
2026-04-07 04:30:57.883444 | localhost | skipping: Conditional result was False
2026-04-07 04:30:57.884352 |
2026-04-07 04:30:57.887193 | localhost | skipping: Conditional result was False
2026-04-07 04:30:57.895136 |
2026-04-07 04:30:57.895328 | LOOP [upload-logs : Upload console log and json output]
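The run above timed out while repeatedly polling two task UUIDs that never left state STARTED. The polling pattern visible in the log (check state, wait a fixed interval, recheck until a deadline) can be sketched as follows; `get_state` is a hypothetical callable standing in for whatever client the deploy playbook actually uses, not the real osism API:

```python
import time


def wait_for_tasks(get_state, task_ids, interval=1.0, timeout=30.0):
    """Poll task states until none are PENDING/STARTED, or raise on timeout.

    Mirrors the "Task ... is in state STARTED" / "Wait 1 second(s) until
    the next check" loop seen in the console log. `get_state(task_id)` is
    an assumed callable returning the task's current state string.
    """
    deadline = time.monotonic() + timeout
    states = {}
    while time.monotonic() < deadline:
        states = {tid: get_state(tid) for tid in task_ids}
        if all(s not in ("PENDING", "STARTED") for s in states.values()):
            return states
        # corresponds to the "Wait 1 second(s) until the next check" lines
        time.sleep(interval)
    # corresponds to RUN END RESULT_TIMED_OUT in the log
    raise TimeoutError(f"tasks still running after {timeout} s: {states}")
```

A loop like this is why a stuck task turns into RESULT_TIMED_OUT rather than a task-level error: the poller has no failure signal to react to, so the job-level timeout is the only thing that ends it.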